Apr 16 16:20:03.445163 ip-10-0-132-246 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:20:03.445177 ip-10-0-132-246 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:20:03.445186 ip-10-0-132-246 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:20:03.445542 ip-10-0-132-246 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:20:13.548076 ip-10-0-132-246 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:20:13.548093 ip-10-0-132-246 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6cfd27c567814facbdc6a7d64faee914 --
Apr 16 16:22:19.132923 ip-10-0-132-246 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:22:19.631804 ip-10-0-132-246 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:22:19.631804 ip-10-0-132-246 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:22:19.631804 ip-10-0-132-246 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:22:19.631804 ip-10-0-132-246 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:22:19.631804 ip-10-0-132-246 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:22:19.636110 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.636019 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:22:19.639318 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639292 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:22:19.639318 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639317 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639323 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639335 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639340 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639344 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639349 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639354 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639358 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639362 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639367 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639372 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639376 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639380 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639384 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639402 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639423 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639429 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639435 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639440 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639448 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:22:19.639461 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639453 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639458 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639463 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639468 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639473 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639484 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639489 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639493 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639497 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639502 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639507 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639514 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639518 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639524 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639530 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639535 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639539 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639548 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639553 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639557 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:22:19.640084 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639561 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639565 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639569 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639574 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639578 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639582 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639586 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639590 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639597 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639602 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639612 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639616 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639621 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639626 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639631 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639637 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639662 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639674 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639677 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:22:19.640661 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639680 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639683 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639707 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639721 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639731 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639740 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639746 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639752 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639757 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639762 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639767 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639772 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639777 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639781 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639792 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639834 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639963 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639967 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639972 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:22:19.641113 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639978 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639982 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639987 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639990 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639993 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639996 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.639999 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640392 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640397 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640400 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640402 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640405 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640408 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640411 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640414 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640416 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640419 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640423 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640426 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640429 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:22:19.641581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640432 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640434 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640437 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640439 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640442 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640445 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640448 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640450 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640454 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640457 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640459 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640462 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640465 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640468 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640470 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640473 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640476 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640478 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640481 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640483 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:22:19.642074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640486 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640489 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640491 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640494 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640497 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640500 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640503 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640505 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640508 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640511 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640513 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640516 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640518 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640521 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640523 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640526 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640528 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640532 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640535 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640537 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:22:19.642575 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640541 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640544 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640546 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640549 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640552 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640554 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640557 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640560 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640562 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640566 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640568 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640570 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640573 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640575 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640578 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640580 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640583 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640586 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640588 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:19.643078 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640591 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640593 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640595 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640598 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640600 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640603 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640605 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640608 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640610 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640613 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640617 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640622 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640625 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.640628 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642198 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642209 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642215 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642219 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642225 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642229 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642233 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:22:19.643568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642238 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642241 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642244 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642247 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642251 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642254 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642257 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642260 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642263 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642266 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642268 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642271 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642276 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642279 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642282 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642285 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642288 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642292 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642295 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642298 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642301 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642305 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642308 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642311 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642314 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:22:19.644087 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642318 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642323 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642326 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642329 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642332 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642336 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642339 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642343 2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642346 2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642348 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642351 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642354 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642358 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642361 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642364 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642367 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642370 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642373 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642376 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642379 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642382 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642384 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:22:19.644699 ip-10-0-132-246
kubenswrapper[2577]: I0416 16:22:19.642387 2577 flags.go:64] FLAG: --feature-gates="" Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642391 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642395 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:22:19.644699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642398 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642401 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642404 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642407 2577 flags.go:64] FLAG: --help="false" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642411 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-132-246.ec2.internal" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642414 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642417 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642420 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642423 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642426 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642430 2577 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642433 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642435 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642439 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642442 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642445 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642447 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642458 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642463 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642467 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642469 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642472 2577 flags.go:64] FLAG: --lock-file="" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642475 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642478 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:22:19.645342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642481 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: 
I0416 16:22:19.642487 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642489 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642492 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642495 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642498 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642501 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642504 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642507 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642511 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642514 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642518 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642521 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642524 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642527 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642530 2577 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642533 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642551 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642555 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642562 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642566 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642569 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642572 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:22:19.645920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642576 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642581 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642584 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642587 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642590 2577 flags.go:64] FLAG: --port="10250" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642593 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642596 
2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08d1e729acff00516" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642599 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642602 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642605 2577 flags.go:64] FLAG: --register-node="true" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642608 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642611 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642615 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642617 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642620 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642623 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642627 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642629 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642632 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642635 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642638 2577 flags.go:64] FLAG: --runonce="false" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642658 2577 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642662 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642665 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642668 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642671 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:22:19.646497 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642674 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642677 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642681 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642683 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642686 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642689 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642693 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642696 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642699 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642701 2577 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642707 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642709 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642712 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642718 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642721 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642723 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642726 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642729 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642732 2577 flags.go:64] FLAG: --v="2" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642736 2577 flags.go:64] FLAG: --version="false" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642743 2577 flags.go:64] FLAG: --vmodule="" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642747 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.642751 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642843 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:22:19.647137 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642848 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642851 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642854 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642858 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642861 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642863 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642866 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642870 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642872 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642875 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642878 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642881 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642884 2577 feature_gate.go:328] unrecognized 
feature gate: RouteAdvertisements Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642887 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642890 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642893 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642896 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642899 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642902 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642904 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:22:19.647722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642907 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642910 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642912 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642914 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642917 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:22:19.648228 ip-10-0-132-246 
kubenswrapper[2577]: W0416 16:22:19.642920 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642922 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642925 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642927 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642930 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642932 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642935 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642937 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642940 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642943 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642946 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642948 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642951 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 
16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642954 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642957 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:22:19.648228 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642959 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642961 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642964 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642966 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642969 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642971 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642974 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642976 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642979 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642981 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642984 
2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642986 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642991 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642994 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642996 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.642999 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643001 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643004 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643006 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:22:19.648722 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643009 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643012 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643014 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643017 2577 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643019 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643022 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643025 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643027 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643034 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643037 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643039 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643043 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643047 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643050 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643052 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643055 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643058 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643061 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643064 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643066 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:19.649224 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643069 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:19.650085 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643072 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:19.650085 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643074 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:19.650085 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643077 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:19.650085 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643080 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:19.650085 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.643084 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:19.650085 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.643842 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:22:19.652693 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.652675 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:22:19.652693 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.652695 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652777 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652785 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652790 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652796 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652801 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652806 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652810 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652815 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652820 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652825 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652829 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652833 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652837 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652841 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:22:19.652838 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652846 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652850 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652854 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652859 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652863 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652867 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652871 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652876 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652880 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652884 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652887 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652893 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652897 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652901 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652905 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652909 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652913 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652920 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652926 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:19.653488 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652931 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652936 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652941 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652945 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652949 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652954 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652958 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652962 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652967 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652971 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652975 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652979 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652983 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652987 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652991 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.652996 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653001 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653005 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653009 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653013 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:19.654151 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653017 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653021 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653025 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653029 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653035 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653041 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653046 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653051 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653055 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653059 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653063 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653067 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653071 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653075 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653080 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653084 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653088 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653092 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653096 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:22:19.654742 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653101 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653105 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653110 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653114 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653118 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653121 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653126 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653130 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653134 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653138 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653143 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653147 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653151 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653156 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.653164 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:22:19.655197 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653329 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653336 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653341 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653346 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653352 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653356 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653360 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653364 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653369 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653373 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653377 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653381 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653386 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653390 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653394 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653398 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653402 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653406 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653411 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:22:19.655767 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653415 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653419 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653423 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653427 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653431 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653436 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653440 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653444 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653448 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653452 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653456 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653461 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653465 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653469 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653473 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653477 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653481 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653485 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653493 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653499 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:22:19.656225 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653503 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653507 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653512 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653516 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653520 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653525 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653529 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653533 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653538 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653542 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653546 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653551 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653555 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653560 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653564 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653568 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653572 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653578 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653584 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:22:19.656797 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653589 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653593 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653597 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653602 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653606 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653610 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653615 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653619 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653623 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653627 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653631 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653635 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653662 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653667 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653671 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653675 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653679 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653683 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653688 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653692 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:22:19.657593 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653696 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653700 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653704 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653709 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653712 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653716 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653720 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:19.653725 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.653733 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:22:19.658193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.654558 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:22:19.658577 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.658562 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:22:19.659694 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.659682 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 16:22:19.659793 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.659776 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:22:19.660753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.660741 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:22:19.688649 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.688629 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:22:19.690421 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.690404 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:22:19.711058 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.711039 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:22:19.722068 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.722048 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 16:22:19.723559 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.723531 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:22:19.723735 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.723721 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:22:19.728017 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.727997 2577 fs.go:135] Filesystem UUIDs: map[5db05822-37de-4b27-b419-cc271cc164b3:/dev/nvme0n1p3 69a958d5-fa6a-452d-85ba-e303fe3ae610:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 16:22:19.728091 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.728017 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:22:19.733881 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.733774 2577 manager.go:217] Machine: {Timestamp:2026-04-16 16:22:19.731662895 +0000 UTC m=+0.462901302 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3130575 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27d005ed3d0e6936576b983dc3fc66 SystemUUID:ec27d005-ed3d-0e69-3657-6b983dc3fc66 BootID:6cfd27c5-6781-4fac-bdc6-a7d64faee914 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bf:c4:be:85:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bf:c4:be:85:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:60:aa:f3:d5:06 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:22:19.733881 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.733877 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:22:19.733978 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.733953 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:22:19.735137 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.735109 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:22:19.735273 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.735141 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-246.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:22:19.735315 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.735283 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:22:19.735315 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.735291 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:22:19.735315 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.735309
2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:22:19.735387 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.735326 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:22:19.737095 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.737083 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:22:19.737200 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.737191 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:22:19.739873 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.739862 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:22:19.739913 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.739880 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:22:19.739913 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.739892 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:22:19.739913 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.739902 2577 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:22:19.740012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.739933 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:22:19.741117 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.741104 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:22:19.741176 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.741124 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:22:19.745304 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.745288 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:22:19.747412 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:22:19.747395 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:22:19.749361 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749346 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749365 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749376 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749383 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749393 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749401 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749410 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749418 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749434 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:22:19.749444 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749443 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:22:19.749748 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749461 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
16:22:19.749748 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.749475 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:22:19.750499 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.750488 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:22:19.750546 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.750502 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:22:19.754398 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.754382 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:22:19.754484 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.754444 2577 server.go:1295] "Started kubelet" Apr 16 16:22:19.754484 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.754465 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-246.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:22:19.754581 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.754478 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:22:19.754581 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.754509 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-246.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:22:19.754581 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.754548 2577 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 16 16:22:19.754581 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.754551 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:22:19.754750 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.754610 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:22:19.755199 ip-10-0-132-246 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:22:19.756083 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.756060 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:22:19.757350 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.757336 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:22:19.762856 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.762836 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:22:19.763335 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.763314 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:22:19.763500 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.763481 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:22:19.767745 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.767715 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:22:19.767902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.767811 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:22:19.768003 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.767993 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:22:19.768161 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.762929 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-246.ec2.internal.18a6e2dc8280b7b7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-246.ec2.internal,UID:ip-10-0-132-246.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-246.ec2.internal,},FirstTimestamp:2026-04-16 16:22:19.754395575 +0000 UTC m=+0.485633984,LastTimestamp:2026-04-16 16:22:19.754395575 +0000 UTC m=+0.485633984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-246.ec2.internal,}" Apr 16 16:22:19.768279 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768267 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:22:19.768332 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768280 2577 reconciler.go:26] "Reconciler: start to sync 
state" Apr 16 16:22:19.768540 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768523 2577 factory.go:55] Registering systemd factory Apr 16 16:22:19.768626 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768571 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:22:19.768897 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768885 2577 factory.go:153] Registering CRI-O factory Apr 16 16:22:19.768897 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768898 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 16:22:19.769009 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.768946 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:22:19.769061 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.769022 2577 factory.go:103] Registering Raw factory Apr 16 16:22:19.769061 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.769033 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 16:22:19.769877 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.769861 2577 manager.go:319] Starting recovery of all containers Apr 16 16:22:19.770806 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.770778 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:19.776617 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.776450 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:22:19.777053 ip-10-0-132-246 
kubenswrapper[2577]: E0416 16:22:19.777026 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-246.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:22:19.781714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.781700 2577 manager.go:324] Recovery completed Apr 16 16:22:19.783398 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.783380 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6lslw" Apr 16 16:22:19.785274 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.785262 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:22:19.788209 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.788191 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:22:19.788280 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.788218 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:22:19.788280 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.788229 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:22:19.788611 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.788599 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:22:19.788611 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.788611 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:22:19.788749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.788629 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:22:19.789427 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:22:19.789409 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6lslw" Apr 16 16:22:19.790388 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.790316 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-246.ec2.internal.18a6e2dc84849d5b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-246.ec2.internal,UID:ip-10-0-132-246.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-246.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-246.ec2.internal,},FirstTimestamp:2026-04-16 16:22:19.788205403 +0000 UTC m=+0.519443809,LastTimestamp:2026-04-16 16:22:19.788205403 +0000 UTC m=+0.519443809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-246.ec2.internal,}" Apr 16 16:22:19.791830 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.791815 2577 policy_none.go:49] "None policy: Start" Apr 16 16:22:19.791916 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.791834 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:22:19.791916 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.791847 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:22:19.828248 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828231 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.828262 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" 
checkpoint="kubelet_internal_checkpoint" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828271 2577 server.go:85] "Starting device plugin registration server" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828474 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828483 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828594 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828671 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.828677 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.829216 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:22:19.837633 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.829246 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:19.873570 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.873546 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:22:19.874875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.874851 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 16:22:19.874875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.874876 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:22:19.875002 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.874895 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 16:22:19.875002 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.874902 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:22:19.875002 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.874934 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:22:19.877974 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.877959 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:22:19.928895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.928854 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:22:19.930040 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.930025 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:22:19.930092 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.930055 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:22:19.930092 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.930065 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:22:19.930092 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.930085 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-246.ec2.internal" Apr 16 16:22:19.938462 ip-10-0-132-246 kubenswrapper[2577]: I0416 
16:22:19.938449 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-246.ec2.internal" Apr 16 16:22:19.938519 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.938469 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-246.ec2.internal\": node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:19.951951 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:19.951933 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:19.975987 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.975953 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal"] Apr 16 16:22:19.976047 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.976021 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:22:19.978017 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.977999 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:22:19.978103 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.978027 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:22:19.978103 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.978041 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:22:19.979231 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.979217 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:22:19.979378 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.979365 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" Apr 16 16:22:19.979433 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.979391 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:22:19.980093 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.980077 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:22:19.980187 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.980092 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:22:19.980187 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.980121 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:22:19.980187 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.980134 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:22:19.980187 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.980101 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:22:19.980187 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.980186 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:22:19.982330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.982312 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:19.982403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.982347 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:22:19.983580 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.983568 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:22:19.983676 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.983616 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:22:19.983676 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:19.983628 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:22:20.006203 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.006178 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-246.ec2.internal\" not found" node="ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.009353 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.009339 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-246.ec2.internal\" not found" node="ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.052469 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.052449 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.070840 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.070820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d155529b212b181b4766962e45b3a8b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-132-246.ec2.internal\" (UID: \"3d155529b212b181b4766962e45b3a8b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.070902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.070844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99fc9fe75bcc54dc55f1a2907880eaa3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal\" (UID: \"99fc9fe75bcc54dc55f1a2907880eaa3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.070902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.070860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99fc9fe75bcc54dc55f1a2907880eaa3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal\" (UID: \"99fc9fe75bcc54dc55f1a2907880eaa3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.153166 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.153144 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.171637 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.171616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d155529b212b181b4766962e45b3a8b-config\") pod \"kube-apiserver-proxy-ip-10-0-132-246.ec2.internal\" (UID: \"3d155529b212b181b4766962e45b3a8b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.171732 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.171638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/99fc9fe75bcc54dc55f1a2907880eaa3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal\" (UID: \"99fc9fe75bcc54dc55f1a2907880eaa3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.171732 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.171671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99fc9fe75bcc54dc55f1a2907880eaa3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal\" (UID: \"99fc9fe75bcc54dc55f1a2907880eaa3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.171809 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.171733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99fc9fe75bcc54dc55f1a2907880eaa3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal\" (UID: \"99fc9fe75bcc54dc55f1a2907880eaa3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.171809 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.171742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99fc9fe75bcc54dc55f1a2907880eaa3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal\" (UID: \"99fc9fe75bcc54dc55f1a2907880eaa3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.171809 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.171750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d155529b212b181b4766962e45b3a8b-config\") pod \"kube-apiserver-proxy-ip-10-0-132-246.ec2.internal\" (UID: \"3d155529b212b181b4766962e45b3a8b\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.254070 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.254011 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.309648 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.309614 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.311249 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.311230 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.355046 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.355019 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.455656 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.455615 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.556287 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.556206 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.656780 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.656752 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.659912 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.659895 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 16:22:20.660029 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.660014 2577 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:22:20.757697 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:20.757679 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-246.ec2.internal\" not found" Apr 16 16:22:20.763787 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.763767 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:22:20.766544 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.766522 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:22:20.768528 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.768507 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.775746 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.775725 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:22:20.782986 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.782958 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:22:20.785296 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.785275 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" Apr 16 16:22:20.791948 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.791923 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" 
expiration="2028-04-15 16:17:19 +0000 UTC" deadline="2027-11-12 09:23:16.082693156 +0000 UTC" Apr 16 16:22:20.791948 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.791947 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13793h0m55.290748405s" Apr 16 16:22:20.800944 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.800926 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:22:20.803891 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.803874 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h4pxc" Apr 16 16:22:20.812352 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.812312 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h4pxc" Apr 16 16:22:20.859542 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:20.859493 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d155529b212b181b4766962e45b3a8b.slice/crio-cb1ee3decbf065e2816d2aa32c59524e077ce79a3fa2ce17ec1e641359af5f55 WatchSource:0}: Error finding container cb1ee3decbf065e2816d2aa32c59524e077ce79a3fa2ce17ec1e641359af5f55: Status 404 returned error can't find the container with id cb1ee3decbf065e2816d2aa32c59524e077ce79a3fa2ce17ec1e641359af5f55 Apr 16 16:22:20.859977 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:20.859940 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99fc9fe75bcc54dc55f1a2907880eaa3.slice/crio-4b525711daa64b71963555be0379e691a54043b4696d5d03648241cd68a9e4b6 WatchSource:0}: Error finding container 4b525711daa64b71963555be0379e691a54043b4696d5d03648241cd68a9e4b6: Status 404 
returned error can't find the container with id 4b525711daa64b71963555be0379e691a54043b4696d5d03648241cd68a9e4b6 Apr 16 16:22:20.864719 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.864705 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:22:20.878000 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.877947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" event={"ID":"99fc9fe75bcc54dc55f1a2907880eaa3","Type":"ContainerStarted","Data":"4b525711daa64b71963555be0379e691a54043b4696d5d03648241cd68a9e4b6"} Apr 16 16:22:20.878981 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.878962 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" event={"ID":"3d155529b212b181b4766962e45b3a8b","Type":"ContainerStarted","Data":"cb1ee3decbf065e2816d2aa32c59524e077ce79a3fa2ce17ec1e641359af5f55"} Apr 16 16:22:20.991953 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:20.991926 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:22:21.136716 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.136657 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:22:21.741377 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.741348 2577 apiserver.go:52] "Watching apiserver" Apr 16 16:22:21.747947 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.747927 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:22:21.750257 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.750233 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal","openshift-multus/multus-additional-cni-plugins-b7fht","openshift-multus/network-metrics-daemon-2wd9w","openshift-ovn-kubernetes/ovnkube-node-8zb4q","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2","openshift-cluster-node-tuning-operator/tuned-b9rpz","openshift-multus/multus-wmqwd","openshift-network-diagnostics/network-check-target-58j4z","openshift-network-operator/iptables-alerter-qgb2m","kube-system/konnectivity-agent-lqxs7","kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal","openshift-dns/node-resolver-jchwl","openshift-image-registry/node-ca-628rf"] Apr 16 16:22:21.754485 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.753764 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.754855 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.754833 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:21.754930 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.754903 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:21.754970 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.754952 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:21.755040 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.755018 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:21.756458 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.756428 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.756559 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.756545 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.756963 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.756922 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gc99q\"" Apr 16 16:22:21.757172 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.757150 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:22:21.757248 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.757196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:22:21.757379 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.757364 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.759190 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.759169 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.761108 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.761274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.761277 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.761748 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.761858 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.762266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mdr2r\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.762668 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.763498 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:22:21.764026 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.763596 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.764757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.764111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.764757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.764157 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.765945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.765803 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.765945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.765820 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.765945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.765832 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.765945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.765881 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:22:21.765945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.765905 2577 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gsv58\"" Apr 16 16:22:21.766240 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.766054 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9hv2k\"" Apr 16 16:22:21.767238 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.767220 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:22:21.767318 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.767286 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t7wn4\"" Apr 16 16:22:21.769075 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.769055 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jchwl" Apr 16 16:22:21.769970 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.769953 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:22:21.770043 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.769978 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:22:21.770336 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.770321 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.770433 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.770403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8hgpb\"" Apr 16 16:22:21.770616 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.770603 2577 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.771035 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.771020 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.771624 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.771608 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dnx9b\"" Apr 16 16:22:21.771719 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.771685 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.772115 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.771793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:21.772115 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.772066 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.774072 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.774048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nx66x\"" Apr 16 16:22:21.774168 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.774048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:22:21.774454 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.774270 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:22:21.774454 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.774311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-hd8q2\"" Apr 16 16:22:21.774605 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.774509 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:22:21.774968 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.774948 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:22:21.775062 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.775031 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:22:21.779234 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-cnibin\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.779317 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglbz\" (UniqueName: \"kubernetes.io/projected/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-kube-api-access-wglbz\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.779317 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.779317 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779295 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysctl-d\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.779434 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-lib-modules\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.779434 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98177e32-055b-446a-807e-b424fddaca83-tmp\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.779434 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-log-socket\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.779434 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrrn\" (UniqueName: \"kubernetes.io/projected/9494261b-183d-4f87-ae51-80217757eafa-kube-api-access-rkrrn\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl" Apr 16 
16:22:21.779434 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-var-lib-kubelet\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-k8s-cni-cncf-io\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779496 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-conf-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-host\") pod \"tuned-b9rpz\" (UID: 
\"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-system-cni-dir\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-os-release\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.779603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lpzm\" (UniqueName: \"kubernetes.io/projected/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-kube-api-access-9lpzm\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779625 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-run-netns\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779674 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-sys\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-socket-dir-parent\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-cni-netd\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-kubelet\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-device-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-run\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-netns\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.779899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-ovn\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-etc-selinux\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-os-release\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-cni-binary-copy\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.779990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-daemon-config\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-modprobe-d\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/98177e32-055b-446a-807e-b424fddaca83-etc-tuned\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-env-overrides\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-ovnkube-script-lib\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-node-log\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9q8\" (UniqueName: \"kubernetes.io/projected/c55af322-e15c-4305-a18a-33df63e34cb9-kube-api-access-9s9q8\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-sys-fs\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c8bcc83b-1135-49eb-a662-f76883a04c53-konnectivity-ca\") pod \"konnectivity-agent-lqxs7\" (UID: \"c8bcc83b-1135-49eb-a662-f76883a04c53\") " pod="kube-system/konnectivity-agent-lqxs7"
Apr 16 16:22:21.780295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-kubelet\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-slash\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-etc-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-cni-bin\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-multus-certs\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysconfig\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-host-slash\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-cni-multus\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-ovnkube-config\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7d5z\" (UniqueName: \"kubernetes.io/projected/a648a078-0f71-4a2f-a255-ad1937929932-kube-api-access-s7d5z\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780473 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-registration-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysctl-conf\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-system-cni-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-cni-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-var-lib-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.780818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-kubernetes\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-systemd\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-cni-binary-copy\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5prk\" (UniqueName: \"kubernetes.io/projected/566f5317-730d-4bea-9936-998ff669835f-kube-api-access-l5prk\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-iptables-alerter-script\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780731 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c8bcc83b-1135-49eb-a662-f76883a04c53-agent-certs\") pod \"konnectivity-agent-lqxs7\" (UID: \"c8bcc83b-1135-49eb-a662-f76883a04c53\") " pod="kube-system/konnectivity-agent-lqxs7"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780778 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-cni-bin\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9494261b-183d-4f87-ae51-80217757eafa-hosts-file\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780927 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-socket-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.780956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggtg7\" (UniqueName: \"kubernetes.io/projected/98177e32-055b-446a-807e-b424fddaca83-kube-api-access-ggtg7\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-cnibin\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9494261b-183d-4f87-ae51-80217757eafa-tmp-dir\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whmbz\" (UniqueName: \"kubernetes.io/projected/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-kube-api-access-whmbz\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781224 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-hostroot\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.781491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-etc-kubernetes\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.782182 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-systemd-units\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.782182 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-systemd\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.782182 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.781311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c55af322-e15c-4305-a18a-33df63e34cb9-ovn-node-metrics-cert\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.813242 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.813172 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:17:20 +0000 UTC" deadline="2028-01-15 05:38:02.637134886 +0000 UTC"
Apr 16 16:22:21.813242 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.813200 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15325h15m40.823938294s"
Apr 16 16:22:21.869475 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.869446 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:22:21.881739 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-hostroot\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.881875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-etc-kubernetes\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.881875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-systemd-units\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.881875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-systemd\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.881875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-hostroot\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.881875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-systemd\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.881875 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c55af322-e15c-4305-a18a-33df63e34cb9-ovn-node-metrics-cert\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-systemd-units\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-etc-kubernetes\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-cnibin\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-cnibin\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wglbz\" (UniqueName: \"kubernetes.io/projected/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-kube-api-access-wglbz\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.881999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff2dc9df-f5a6-47e5-9597-5d45855573cd-host\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysctl-d\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.882155 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-lib-modules\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98177e32-055b-446a-807e-b424fddaca83-tmp\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysctl-d\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-log-socket\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrrn\" (UniqueName: \"kubernetes.io/projected/9494261b-183d-4f87-ae51-80217757eafa-kube-api-access-rkrrn\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-log-socket\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-lib-modules\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-var-lib-kubelet\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-k8s-cni-cncf-io\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.882502 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:22:21.882608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-conf-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-host\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-k8s-cni-cncf-io\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-var-lib-kubelet\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-conf-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882500 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.882701 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:22.382667088 +0000 UTC m=+3.113905508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-host\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-system-cni-dir\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-os-release\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lpzm\" (UniqueName: \"kubernetes.io/projected/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-kube-api-access-9lpzm\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-system-cni-dir\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.882901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-os-release\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-run-netns\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-sys\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz"
Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-socket-dir-parent\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd"
Apr 16
16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-cni-netd\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-sys\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-run-netns\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8zb4q\" (UID: 
\"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-cni-netd\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-socket-dir-parent\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-kubelet\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-device-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-run\") pod \"tuned-b9rpz\" (UID: 
\"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-netns\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-kubelet\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-ovn\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-etc-selinux\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-run\") pod \"tuned-b9rpz\" (UID: 
\"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-netns\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-device-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-os-release\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-run-ovn\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.883985 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-cni-binary-copy\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " 
pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-daemon-config\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-etc-selinux\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-os-release\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-modprobe-d\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " 
pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/98177e32-055b-446a-807e-b424fddaca83-etc-tuned\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-env-overrides\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-ovnkube-script-lib\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfvv9\" (UniqueName: \"kubernetes.io/projected/ff2dc9df-f5a6-47e5-9597-5d45855573cd-kube-api-access-lfvv9\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-modprobe-d\") pod \"tuned-b9rpz\" (UID: 
\"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-node-log\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9q8\" (UniqueName: \"kubernetes.io/projected/c55af322-e15c-4305-a18a-33df63e34cb9-kube-api-access-9s9q8\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-sys-fs\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.884821 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c8bcc83b-1135-49eb-a662-f76883a04c53-konnectivity-ca\") pod \"konnectivity-agent-lqxs7\" (UID: \"c8bcc83b-1135-49eb-a662-f76883a04c53\") " pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-kubelet\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-slash\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883859 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-etc-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-cni-bin\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-multus-certs\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysconfig\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-host-slash\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883981 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-cni-binary-copy\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-kubelet\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-slash\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.883987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-cni-multus\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-etc-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884121 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-host-cni-bin\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-cni-multus\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-ovnkube-config\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7d5z\" (UniqueName: \"kubernetes.io/projected/a648a078-0f71-4a2f-a255-ad1937929932-kube-api-access-s7d5z\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.885563 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884214 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-ovnkube-script-lib\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-registration-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-node-log\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-registration-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-run-multus-certs\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.886403 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:22:21.884408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysconfig\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-host-slash\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-sys-fs\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-env-overrides\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysctl-conf\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.886403 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:22:21.884818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-daemon-config\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-system-cni-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.884993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-cni-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-var-lib-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.886403 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:22:21.885051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-kubernetes\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-systemd\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.886403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-cni-binary-copy\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c8bcc83b-1135-49eb-a662-f76883a04c53-konnectivity-ca\") pod \"konnectivity-agent-lqxs7\" (UID: \"c8bcc83b-1135-49eb-a662-f76883a04c53\") " pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-system-cni-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.887196 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-multus-cni-dir\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-sysctl-conf\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-systemd\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c55af322-e15c-4305-a18a-33df63e34cb9-var-lib-openvswitch\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98177e32-055b-446a-807e-b424fddaca83-etc-kubernetes\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: 
I0416 16:22:21.885209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5prk\" (UniqueName: \"kubernetes.io/projected/566f5317-730d-4bea-9936-998ff669835f-kube-api-access-l5prk\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-iptables-alerter-script\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c8bcc83b-1135-49eb-a662-f76883a04c53-agent-certs\") pod \"konnectivity-agent-lqxs7\" (UID: \"c8bcc83b-1135-49eb-a662-f76883a04c53\") " pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-cni-bin\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9494261b-183d-4f87-ae51-80217757eafa-hosts-file\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl" Apr 16 
16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-socket-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggtg7\" (UniqueName: \"kubernetes.io/projected/98177e32-055b-446a-807e-b424fddaca83-kube-api-access-ggtg7\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-socket-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-cnibin\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.887196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c55af322-e15c-4305-a18a-33df63e34cb9-ovnkube-config\") pod \"ovnkube-node-8zb4q\" (UID: 
\"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-host-var-lib-cni-bin\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/9494261b-183d-4f87-ae51-80217757eafa-tmp-dir\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-cnibin\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9494261b-183d-4f87-ae51-80217757eafa-hosts-file\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ff2dc9df-f5a6-47e5-9597-5d45855573cd-serviceca\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-cni-binary-copy\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.885618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whmbz\" 
(UniqueName: \"kubernetes.io/projected/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-kube-api-access-whmbz\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: \"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.886077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-iptables-alerter-script\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.886103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/566f5317-730d-4bea-9936-998ff669835f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.886111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/566f5317-730d-4bea-9936-998ff669835f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.888164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.886263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9494261b-183d-4f87-ae51-80217757eafa-tmp-dir\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl" Apr 16 16:22:21.888853 ip-10-0-132-246 kubenswrapper[2577]: I0416 
16:22:21.888827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/98177e32-055b-446a-807e-b424fddaca83-etc-tuned\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.889564 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.889540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c55af322-e15c-4305-a18a-33df63e34cb9-ovn-node-metrics-cert\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.889965 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.889945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c8bcc83b-1135-49eb-a662-f76883a04c53-agent-certs\") pod \"konnectivity-agent-lqxs7\" (UID: \"c8bcc83b-1135-49eb-a662-f76883a04c53\") " pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:21.892158 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.892136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wglbz\" (UniqueName: \"kubernetes.io/projected/53a4fb09-6477-4a78-b6b9-b6dfa2c3499a-kube-api-access-wglbz\") pod \"multus-wmqwd\" (UID: \"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a\") " pod="openshift-multus/multus-wmqwd" Apr 16 16:22:21.894206 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.894185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/98177e32-055b-446a-807e-b424fddaca83-tmp\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.895299 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.895257 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rkrrn\" (UniqueName: \"kubernetes.io/projected/9494261b-183d-4f87-ae51-80217757eafa-kube-api-access-rkrrn\") pod \"node-resolver-jchwl\" (UID: \"9494261b-183d-4f87-ae51-80217757eafa\") " pod="openshift-dns/node-resolver-jchwl" Apr 16 16:22:21.896387 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.896096 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:21.896387 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.896118 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:21.896387 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.896131 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wztjn for pod openshift-network-diagnostics/network-check-target-58j4z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:21.896387 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:21.896192 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn podName:0dcde09b-f2df-44aa-b593-4fc117b2e8f7 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:22.396175703 +0000 UTC m=+3.127414118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wztjn" (UniqueName: "kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn") pod "network-check-target-58j4z" (UID: "0dcde09b-f2df-44aa-b593-4fc117b2e8f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:21.898819 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.898782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5prk\" (UniqueName: \"kubernetes.io/projected/566f5317-730d-4bea-9936-998ff669835f-kube-api-access-l5prk\") pod \"multus-additional-cni-plugins-b7fht\" (UID: \"566f5317-730d-4bea-9936-998ff669835f\") " pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:21.899438 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.899418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7d5z\" (UniqueName: \"kubernetes.io/projected/a648a078-0f71-4a2f-a255-ad1937929932-kube-api-access-s7d5z\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:21.899514 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.899447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9q8\" (UniqueName: \"kubernetes.io/projected/c55af322-e15c-4305-a18a-33df63e34cb9-kube-api-access-9s9q8\") pod \"ovnkube-node-8zb4q\" (UID: \"c55af322-e15c-4305-a18a-33df63e34cb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:21.900912 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.900890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whmbz\" (UniqueName: \"kubernetes.io/projected/af95b57e-af8d-4931-8c1f-c676a0f9ebfd-kube-api-access-whmbz\") pod \"aws-ebs-csi-driver-node-txbv2\" (UID: 
\"af95b57e-af8d-4931-8c1f-c676a0f9ebfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:21.901227 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.901203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lpzm\" (UniqueName: \"kubernetes.io/projected/e7b517b4-d60a-4fc6-9d1b-018cfb3630fc-kube-api-access-9lpzm\") pod \"iptables-alerter-qgb2m\" (UID: \"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc\") " pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:21.901568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.901553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggtg7\" (UniqueName: \"kubernetes.io/projected/98177e32-055b-446a-807e-b424fddaca83-kube-api-access-ggtg7\") pod \"tuned-b9rpz\" (UID: \"98177e32-055b-446a-807e-b424fddaca83\") " pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:21.986455 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.986411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfvv9\" (UniqueName: \"kubernetes.io/projected/ff2dc9df-f5a6-47e5-9597-5d45855573cd-kube-api-access-lfvv9\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.986634 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.986490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ff2dc9df-f5a6-47e5-9597-5d45855573cd-serviceca\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.986634 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.986525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff2dc9df-f5a6-47e5-9597-5d45855573cd-host\") pod 
\"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.986634 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.986598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff2dc9df-f5a6-47e5-9597-5d45855573cd-host\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.986992 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.986974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ff2dc9df-f5a6-47e5-9597-5d45855573cd-serviceca\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:21.995084 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:21.995025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfvv9\" (UniqueName: \"kubernetes.io/projected/ff2dc9df-f5a6-47e5-9597-5d45855573cd-kube-api-access-lfvv9\") pod \"node-ca-628rf\" (UID: \"ff2dc9df-f5a6-47e5-9597-5d45855573cd\") " pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:22.068009 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.067976 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wmqwd" Apr 16 16:22:22.075983 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.075962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:22.083370 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.083349 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" Apr 16 16:22:22.092095 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.092076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" Apr 16 16:22:22.099670 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.099636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b7fht" Apr 16 16:22:22.107410 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.107393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgb2m" Apr 16 16:22:22.112360 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.112335 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:22:22.113402 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.113384 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jchwl" Apr 16 16:22:22.121015 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.120985 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-628rf" Apr 16 16:22:22.127505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.127488 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:22.388492 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.388406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:22.388665 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:22.388563 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:22.388665 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:22.388636 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:23.388615889 +0000 UTC m=+4.119854285 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:22.489295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.489266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:22.489454 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:22.489437 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:22.489548 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:22.489460 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:22.489548 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:22.489469 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wztjn for pod openshift-network-diagnostics/network-check-target-58j4z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:22.489548 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:22.489528 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn podName:0dcde09b-f2df-44aa-b593-4fc117b2e8f7 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:22:23.489509315 +0000 UTC m=+4.220747739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wztjn" (UniqueName: "kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn") pod "network-check-target-58j4z" (UID: "0dcde09b-f2df-44aa-b593-4fc117b2e8f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:22.665074 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.665052 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98177e32_055b_446a_807e_b424fddaca83.slice/crio-253bf9fccae24c4035267a92574e0382c6fc6c50d33aafb5f598a49ed68d6703 WatchSource:0}: Error finding container 253bf9fccae24c4035267a92574e0382c6fc6c50d33aafb5f598a49ed68d6703: Status 404 returned error can't find the container with id 253bf9fccae24c4035267a92574e0382c6fc6c50d33aafb5f598a49ed68d6703 Apr 16 16:22:22.666519 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.666431 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf95b57e_af8d_4931_8c1f_c676a0f9ebfd.slice/crio-73b4f82487c59571a93c9a0a7e7e97961f3e25d84fa5882d76b5ede6b3193f54 WatchSource:0}: Error finding container 73b4f82487c59571a93c9a0a7e7e97961f3e25d84fa5882d76b5ede6b3193f54: Status 404 returned error can't find the container with id 73b4f82487c59571a93c9a0a7e7e97961f3e25d84fa5882d76b5ede6b3193f54 Apr 16 16:22:22.667273 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.667249 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff2dc9df_f5a6_47e5_9597_5d45855573cd.slice/crio-0dbd228101fb8d79064f52950bce39d783f124239cf5889f8a7830ad03275719 WatchSource:0}: Error finding container 
0dbd228101fb8d79064f52950bce39d783f124239cf5889f8a7830ad03275719: Status 404 returned error can't find the container with id 0dbd228101fb8d79064f52950bce39d783f124239cf5889f8a7830ad03275719 Apr 16 16:22:22.669511 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.669482 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc55af322_e15c_4305_a18a_33df63e34cb9.slice/crio-87a610e60e59e38e6504e0c1ac2285e457d65b42c2d9f8df6ba3942a78900d16 WatchSource:0}: Error finding container 87a610e60e59e38e6504e0c1ac2285e457d65b42c2d9f8df6ba3942a78900d16: Status 404 returned error can't find the container with id 87a610e60e59e38e6504e0c1ac2285e457d65b42c2d9f8df6ba3942a78900d16 Apr 16 16:22:22.671035 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.671011 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod566f5317_730d_4bea_9936_998ff669835f.slice/crio-8919c96503b9cc0d55e68c0049a831976d849166cb24c3092a14744f21e8808f WatchSource:0}: Error finding container 8919c96503b9cc0d55e68c0049a831976d849166cb24c3092a14744f21e8808f: Status 404 returned error can't find the container with id 8919c96503b9cc0d55e68c0049a831976d849166cb24c3092a14744f21e8808f Apr 16 16:22:22.672131 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.672047 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a4fb09_6477_4a78_b6b9_b6dfa2c3499a.slice/crio-0c768f328e58eb981e6cea0127492b848d4c1356af63cac2944c4c95b3456e01 WatchSource:0}: Error finding container 0c768f328e58eb981e6cea0127492b848d4c1356af63cac2944c4c95b3456e01: Status 404 returned error can't find the container with id 0c768f328e58eb981e6cea0127492b848d4c1356af63cac2944c4c95b3456e01 Apr 16 16:22:22.673110 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.673084 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9494261b_183d_4f87_ae51_80217757eafa.slice/crio-2f79a150c469e28c3b4a3ac11bfc2edb5b096533ea9a9a2d1fa99595907f92d9 WatchSource:0}: Error finding container 2f79a150c469e28c3b4a3ac11bfc2edb5b096533ea9a9a2d1fa99595907f92d9: Status 404 returned error can't find the container with id 2f79a150c469e28c3b4a3ac11bfc2edb5b096533ea9a9a2d1fa99595907f92d9 Apr 16 16:22:22.674168 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.674144 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b517b4_d60a_4fc6_9d1b_018cfb3630fc.slice/crio-c36ff529f0cea0a27dd5f817269fd6006a1b506719fafeb795c842c57e512b05 WatchSource:0}: Error finding container c36ff529f0cea0a27dd5f817269fd6006a1b506719fafeb795c842c57e512b05: Status 404 returned error can't find the container with id c36ff529f0cea0a27dd5f817269fd6006a1b506719fafeb795c842c57e512b05 Apr 16 16:22:22.675241 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:22.675218 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8bcc83b_1135_49eb_a662_f76883a04c53.slice/crio-71c67cfe401cc4736967b6fe87d255cab5605ce1f99cd7a3e4fd482947725dd5 WatchSource:0}: Error finding container 71c67cfe401cc4736967b6fe87d255cab5605ce1f99cd7a3e4fd482947725dd5: Status 404 returned error can't find the container with id 71c67cfe401cc4736967b6fe87d255cab5605ce1f99cd7a3e4fd482947725dd5 Apr 16 16:22:22.814369 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.814228 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:17:20 +0000 UTC" deadline="2027-10-29 09:11:05.648407221 +0000 UTC" Apr 16 16:22:22.814369 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.814365 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13456h48m42.834046924s" Apr 16 16:22:22.882867 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.882827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lqxs7" event={"ID":"c8bcc83b-1135-49eb-a662-f76883a04c53","Type":"ContainerStarted","Data":"71c67cfe401cc4736967b6fe87d255cab5605ce1f99cd7a3e4fd482947725dd5"} Apr 16 16:22:22.883805 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.883777 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmqwd" event={"ID":"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a","Type":"ContainerStarted","Data":"0c768f328e58eb981e6cea0127492b848d4c1356af63cac2944c4c95b3456e01"} Apr 16 16:22:22.884757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.884732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerStarted","Data":"8919c96503b9cc0d55e68c0049a831976d849166cb24c3092a14744f21e8808f"} Apr 16 16:22:22.885738 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.885718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"87a610e60e59e38e6504e0c1ac2285e457d65b42c2d9f8df6ba3942a78900d16"} Apr 16 16:22:22.886564 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.886547 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" event={"ID":"98177e32-055b-446a-807e-b424fddaca83","Type":"ContainerStarted","Data":"253bf9fccae24c4035267a92574e0382c6fc6c50d33aafb5f598a49ed68d6703"} Apr 16 16:22:22.887843 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.887823 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" 
event={"ID":"3d155529b212b181b4766962e45b3a8b","Type":"ContainerStarted","Data":"a431b423866c2d82df908f6f1397547fc0530b1d4657e0003d3e08dd18232818"} Apr 16 16:22:22.888793 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.888775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jchwl" event={"ID":"9494261b-183d-4f87-ae51-80217757eafa","Type":"ContainerStarted","Data":"2f79a150c469e28c3b4a3ac11bfc2edb5b096533ea9a9a2d1fa99595907f92d9"} Apr 16 16:22:22.889687 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.889669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgb2m" event={"ID":"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc","Type":"ContainerStarted","Data":"c36ff529f0cea0a27dd5f817269fd6006a1b506719fafeb795c842c57e512b05"} Apr 16 16:22:22.890496 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.890478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-628rf" event={"ID":"ff2dc9df-f5a6-47e5-9597-5d45855573cd","Type":"ContainerStarted","Data":"0dbd228101fb8d79064f52950bce39d783f124239cf5889f8a7830ad03275719"} Apr 16 16:22:22.891254 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:22.891235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" event={"ID":"af95b57e-af8d-4931-8c1f-c676a0f9ebfd","Type":"ContainerStarted","Data":"73b4f82487c59571a93c9a0a7e7e97961f3e25d84fa5882d76b5ede6b3193f54"} Apr 16 16:22:23.395276 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:23.395242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:23.395447 ip-10-0-132-246 kubenswrapper[2577]: E0416 
16:22:23.395374 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:23.395447 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.395436 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:25.395416543 +0000 UTC m=+6.126654940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:23.496525 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:23.496471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:23.496745 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.496696 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:23.496745 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.496719 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:23.496745 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.496732 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wztjn for pod 
openshift-network-diagnostics/network-check-target-58j4z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:23.496927 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.496798 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn podName:0dcde09b-f2df-44aa-b593-4fc117b2e8f7 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:25.496780356 +0000 UTC m=+6.228018755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wztjn" (UniqueName: "kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn") pod "network-check-target-58j4z" (UID: "0dcde09b-f2df-44aa-b593-4fc117b2e8f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:23.877600 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:23.877094 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:23.877600 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.877219 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:23.878339 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:23.878193 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:23.878339 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:23.878297 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:23.900009 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:23.899979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" event={"ID":"99fc9fe75bcc54dc55f1a2907880eaa3","Type":"ContainerStarted","Data":"f4f0c6220bb4a63c20b54943c708ef5babb5018e54235f1d46e78ae7495f70aa"} Apr 16 16:22:23.912276 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:23.912228 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-246.ec2.internal" podStartSLOduration=3.91221495 podStartE2EDuration="3.91221495s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:22:22.905041582 +0000 UTC m=+3.636279998" watchObservedRunningTime="2026-04-16 16:22:23.91221495 +0000 UTC m=+4.643453367" Apr 16 16:22:24.907722 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:24.907684 2577 generic.go:358] "Generic (PLEG): container finished" podID="99fc9fe75bcc54dc55f1a2907880eaa3" containerID="f4f0c6220bb4a63c20b54943c708ef5babb5018e54235f1d46e78ae7495f70aa" exitCode=0 Apr 16 16:22:24.908159 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:24.907743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" event={"ID":"99fc9fe75bcc54dc55f1a2907880eaa3","Type":"ContainerDied","Data":"f4f0c6220bb4a63c20b54943c708ef5babb5018e54235f1d46e78ae7495f70aa"} Apr 16 16:22:25.416094 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:25.416050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:25.416288 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.416219 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:25.416288 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.416280 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:29.41626269 +0000 UTC m=+10.147501089 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:25.517939 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:25.517267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:25.517939 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.517490 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:25.517939 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.517509 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:25.517939 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.517521 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wztjn for pod openshift-network-diagnostics/network-check-target-58j4z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:25.517939 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.517576 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn podName:0dcde09b-f2df-44aa-b593-4fc117b2e8f7 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:22:29.517558036 +0000 UTC m=+10.248796435 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wztjn" (UniqueName: "kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn") pod "network-check-target-58j4z" (UID: "0dcde09b-f2df-44aa-b593-4fc117b2e8f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:25.875859 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:25.875771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:25.876042 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.875903 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:25.876259 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:25.875771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:25.876392 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:25.876361 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:27.875845 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:27.875787 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:27.876272 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:27.875957 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:27.876272 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:27.876020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:27.876272 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:27.876153 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:29.450635 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:29.450595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:29.451105 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.450735 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:29.451105 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.450800 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:37.450785016 +0000 UTC m=+18.182023410 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:29.551413 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:29.551373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:29.551619 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.551572 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:29.551619 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.551601 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:29.551619 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.551615 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wztjn for pod openshift-network-diagnostics/network-check-target-58j4z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:29.551804 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.551698 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn podName:0dcde09b-f2df-44aa-b593-4fc117b2e8f7 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:22:37.551677212 +0000 UTC m=+18.282915626 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wztjn" (UniqueName: "kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn") pod "network-check-target-58j4z" (UID: "0dcde09b-f2df-44aa-b593-4fc117b2e8f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:29.878292 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:29.878221 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:29.878292 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:29.878232 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:29.878514 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.878350 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:29.878514 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:29.878495 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:31.880771 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:31.880729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:31.881078 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:31.880858 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:31.881078 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:31.881005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:31.881228 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:31.881119 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:33.875915 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:33.875876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:33.876371 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:33.875934 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:33.876371 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:33.876004 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:33.876371 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:33.876136 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:35.875306 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:35.875271 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:35.875779 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:35.875401 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:35.875779 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:35.875413 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:35.875779 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:35.875538 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:37.506876 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:37.506838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:37.507428 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.506967 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:37.507428 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.507027 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:53.507013039 +0000 UTC m=+34.238251433 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:22:37.607278 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:37.607245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:37.607528 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.607377 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:22:37.607528 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.607399 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:22:37.607528 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.607411 2577 projected.go:194] Error preparing data for projected volume kube-api-access-wztjn for pod openshift-network-diagnostics/network-check-target-58j4z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:37.607528 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.607479 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn podName:0dcde09b-f2df-44aa-b593-4fc117b2e8f7 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:22:53.607459009 +0000 UTC m=+34.338697404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wztjn" (UniqueName: "kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn") pod "network-check-target-58j4z" (UID: "0dcde09b-f2df-44aa-b593-4fc117b2e8f7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:22:37.875803 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:37.875727 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:37.875958 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.875850 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:37.875958 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:37.875910 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:37.876063 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:37.876031 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:39.878082 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:39.877478 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:39.878082 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:39.877570 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:39.878495 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:39.878115 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:39.878495 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:39.878215 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:39.947354 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:39.946325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" event={"ID":"99fc9fe75bcc54dc55f1a2907880eaa3","Type":"ContainerStarted","Data":"cd6f5caeb5f4168f05416b10677a3a86a3fd86a5ed069c1b82aec056776901b2"} Apr 16 16:22:40.951528 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.951105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lqxs7" event={"ID":"c8bcc83b-1135-49eb-a662-f76883a04c53","Type":"ContainerStarted","Data":"4d72b8de75aeb5c59c7af4c9c163f4b8643bcdba984457ef87f70326ee4ec8f4"} Apr 16 16:22:40.952574 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.952533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wmqwd" event={"ID":"53a4fb09-6477-4a78-b6b9-b6dfa2c3499a","Type":"ContainerStarted","Data":"175a956e03d15c26b9c9af62f5f38178b25ce0468ff0f6a54d5c4d417c5836ed"} Apr 16 16:22:40.954058 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.954029 2577 generic.go:358] "Generic (PLEG): container finished" podID="566f5317-730d-4bea-9936-998ff669835f" containerID="732caa988946c0a42f085d2444bbc21485778bf2457daa21aa1cfbf5bd3d8c7e" exitCode=0 Apr 16 16:22:40.954249 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.954116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerDied","Data":"732caa988946c0a42f085d2444bbc21485778bf2457daa21aa1cfbf5bd3d8c7e"} Apr 16 16:22:40.958996 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.958972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" 
event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"229ee6bd9c73727baa30bcbca2981d38bf26226f1f7ee474156dfadd03f8fb31"} Apr 16 16:22:40.958996 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.959001 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"2deaecba901b7ecdb89635393b3bfc2f2c286663f3c5e79ba0de77a52fcf4d92"} Apr 16 16:22:40.959160 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.959013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"7b90feff890ae8a002a37785c9e6f7494799cb0e547919844388fbf8189fc3ef"} Apr 16 16:22:40.959160 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.959023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"6d55db157bf70110ab73307485e88426973d5ff55076308474f7e581b7586feb"} Apr 16 16:22:40.959160 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.959030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"a818e992178be79cae0b456d58c09bb0b4954ae88ac898e677d317b404b72dbe"} Apr 16 16:22:40.960420 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.960399 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" event={"ID":"98177e32-055b-446a-807e-b424fddaca83","Type":"ContainerStarted","Data":"315cbbe6bd2f760908f244960c1805d63a5574a2b8d1e7902e5a9b56d7e014c0"} Apr 16 16:22:40.961841 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.961816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-jchwl" event={"ID":"9494261b-183d-4f87-ae51-80217757eafa","Type":"ContainerStarted","Data":"089c638531eb2a9c43f3b2cf0bed0eb7de51826907da2f4ed8a1730824995aab"} Apr 16 16:22:40.963274 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.963242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-628rf" event={"ID":"ff2dc9df-f5a6-47e5-9597-5d45855573cd","Type":"ContainerStarted","Data":"8e5e6e020682bcb77d848346eb93ae5940bd838e19de770d6a49707ec9a109e6"} Apr 16 16:22:40.964855 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.964819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" event={"ID":"af95b57e-af8d-4931-8c1f-c676a0f9ebfd","Type":"ContainerStarted","Data":"c8d76298b30bf7240f147ed159bc9a365fd4e68b5da3a6f97067f9ad2ca7bf7e"} Apr 16 16:22:40.965748 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.965711 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-246.ec2.internal" podStartSLOduration=20.965699703 podStartE2EDuration="20.965699703s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:22:39.961306022 +0000 UTC m=+20.692544439" watchObservedRunningTime="2026-04-16 16:22:40.965699703 +0000 UTC m=+21.696938118" Apr 16 16:22:40.966431 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.966386 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lqxs7" podStartSLOduration=3.902056513 podStartE2EDuration="20.966376906s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.677626867 +0000 UTC m=+3.408865261" lastFinishedPulling="2026-04-16 16:22:39.741947259 +0000 UTC m=+20.473185654" 
observedRunningTime="2026-04-16 16:22:40.965590152 +0000 UTC m=+21.696828568" watchObservedRunningTime="2026-04-16 16:22:40.966376906 +0000 UTC m=+21.697615325" Apr 16 16:22:40.978552 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:40.978513 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b9rpz" podStartSLOduration=3.8806451280000003 podStartE2EDuration="20.978501339s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.666861836 +0000 UTC m=+3.398100242" lastFinishedPulling="2026-04-16 16:22:39.764718051 +0000 UTC m=+20.495956453" observedRunningTime="2026-04-16 16:22:40.978047531 +0000 UTC m=+21.709285948" watchObservedRunningTime="2026-04-16 16:22:40.978501339 +0000 UTC m=+21.709739755" Apr 16 16:22:41.007002 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.006956 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jchwl" podStartSLOduration=3.919187402 podStartE2EDuration="21.006944478s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.675480101 +0000 UTC m=+3.406718500" lastFinishedPulling="2026-04-16 16:22:39.763237181 +0000 UTC m=+20.494475576" observedRunningTime="2026-04-16 16:22:40.991441987 +0000 UTC m=+21.722680404" watchObservedRunningTime="2026-04-16 16:22:41.006944478 +0000 UTC m=+21.738182873" Apr 16 16:22:41.029800 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.029733 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-628rf" podStartSLOduration=8.432070632 podStartE2EDuration="21.029721936s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.670635083 +0000 UTC m=+3.401873482" lastFinishedPulling="2026-04-16 16:22:35.268286389 +0000 UTC m=+15.999524786" observedRunningTime="2026-04-16 16:22:41.007522726 +0000 UTC 
m=+21.738761142" watchObservedRunningTime="2026-04-16 16:22:41.029721936 +0000 UTC m=+21.760960352" Apr 16 16:22:41.052254 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.052209 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wmqwd" podStartSLOduration=4.917676866 podStartE2EDuration="22.05219732s" podCreationTimestamp="2026-04-16 16:22:19 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.673889594 +0000 UTC m=+3.405128005" lastFinishedPulling="2026-04-16 16:22:39.808410056 +0000 UTC m=+20.539648459" observedRunningTime="2026-04-16 16:22:41.051804077 +0000 UTC m=+21.783042492" watchObservedRunningTime="2026-04-16 16:22:41.05219732 +0000 UTC m=+21.783435735" Apr 16 16:22:41.452629 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.452607 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:22:41.841347 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.841242 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:22:41.452622509Z","UUID":"4eefcb96-5eee-4f2d-9609-b5d1aea12b6e","Handler":null,"Name":"","Endpoint":""} Apr 16 16:22:41.845809 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.845392 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:22:41.845809 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.845429 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:22:41.875979 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.875948 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:41.876129 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:41.876061 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:41.876766 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.876750 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:41.876843 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:41.876828 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:41.970235 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.970195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"859376082e8daf9c67ec745b856486fa5eda72b2ec8421293d82716ce55a52b3"} Apr 16 16:22:41.971710 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.971686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgb2m" event={"ID":"e7b517b4-d60a-4fc6-9d1b-018cfb3630fc","Type":"ContainerStarted","Data":"e5fea25c2075924d2e87fcc0e3ff9b7fb2a8d69608b59fe12187f7297f5b9065"} Apr 16 16:22:41.976264 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.976237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" event={"ID":"af95b57e-af8d-4931-8c1f-c676a0f9ebfd","Type":"ContainerStarted","Data":"1eb80aa465c6ec05a831f1f93d250a04acabecb041f7f58a39f855297051fcfc"} Apr 16 16:22:41.994591 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:41.994522 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qgb2m" podStartSLOduration=4.908535226 podStartE2EDuration="21.99450903s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.676724714 +0000 UTC m=+3.407963110" lastFinishedPulling="2026-04-16 16:22:39.762698503 +0000 UTC m=+20.493936914" observedRunningTime="2026-04-16 16:22:41.994306815 +0000 UTC m=+22.725545231" watchObservedRunningTime="2026-04-16 16:22:41.99450903 +0000 UTC m=+22.725747438" Apr 16 16:22:42.215315 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:42.215287 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lqxs7" Apr 16 
16:22:42.216037 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:42.216007 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:42.980065 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:42.980029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" event={"ID":"af95b57e-af8d-4931-8c1f-c676a0f9ebfd","Type":"ContainerStarted","Data":"78caea6680258049fd939db5538f14d8da045315330ca18745ef7d8610de5517"} Apr 16 16:22:42.980547 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:42.980238 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:42.980757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:42.980738 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lqxs7" Apr 16 16:22:42.998185 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:42.998135 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-txbv2" podStartSLOduration=3.171499607 podStartE2EDuration="22.998119247s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.670276417 +0000 UTC m=+3.401514817" lastFinishedPulling="2026-04-16 16:22:42.496896062 +0000 UTC m=+23.228134457" observedRunningTime="2026-04-16 16:22:42.997691284 +0000 UTC m=+23.728929699" watchObservedRunningTime="2026-04-16 16:22:42.998119247 +0000 UTC m=+23.729357665" Apr 16 16:22:43.875416 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:43.875230 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:43.875629 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:43.875307 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:43.875629 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:43.875498 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:43.875629 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:43.875618 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:43.985582 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:43.985545 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"75e60ed5d32b95b151fca4d72145af33d2b1fb26628a9550623d55ff327dd0f7"} Apr 16 16:22:45.875931 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:45.875773 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:45.876780 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:45.875996 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:45.876780 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:45.875786 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:45.876780 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:45.876077 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:45.990831 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:45.990803 2577 generic.go:358] "Generic (PLEG): container finished" podID="566f5317-730d-4bea-9936-998ff669835f" containerID="9af967c584c2d491cb9b03ef450f3d5794e513b617796a9227569b14a8f11b5a" exitCode=0 Apr 16 16:22:45.990958 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:45.990867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerDied","Data":"9af967c584c2d491cb9b03ef450f3d5794e513b617796a9227569b14a8f11b5a"} Apr 16 16:22:45.994135 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:45.994075 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" event={"ID":"c55af322-e15c-4305-a18a-33df63e34cb9","Type":"ContainerStarted","Data":"04b82dd0a488b2427b3d20c2756659e2283076acc133f1d4c1a6d6f5eecc40b6"} Apr 16 16:22:45.994453 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:45.994432 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:46.009313 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:46.009292 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:46.033844 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:46.033806 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" podStartSLOduration=8.632032055 podStartE2EDuration="26.033793599s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.671760074 +0000 UTC m=+3.402998470" lastFinishedPulling="2026-04-16 16:22:40.073521606 +0000 UTC m=+20.804760014" observedRunningTime="2026-04-16 16:22:46.0333801 +0000 UTC m=+26.764618516" watchObservedRunningTime="2026-04-16 16:22:46.033793599 +0000 UTC m=+26.765032015" Apr 16 16:22:46.997778 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:46.997750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerStarted","Data":"9a2504159f5994dc8ad92155097589aa59f26c5472ae4af95f3a76096fd2995f"} Apr 16 16:22:46.998127 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:46.997987 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:46.998445 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:46.998427 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:47.013663 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:47.013623 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q" Apr 16 16:22:47.334615 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:47.334583 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-58j4z"] 
Apr 16 16:22:47.334798 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:47.334709 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:47.334798 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:47.334787 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:47.337178 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:47.337154 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2wd9w"] Apr 16 16:22:47.337273 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:47.337249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:47.337338 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:47.337322 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:48.000790 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:48.000757 2577 generic.go:358] "Generic (PLEG): container finished" podID="566f5317-730d-4bea-9936-998ff669835f" containerID="9a2504159f5994dc8ad92155097589aa59f26c5472ae4af95f3a76096fd2995f" exitCode=0 Apr 16 16:22:48.001205 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:48.000839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerDied","Data":"9a2504159f5994dc8ad92155097589aa59f26c5472ae4af95f3a76096fd2995f"} Apr 16 16:22:48.875571 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:48.875547 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:48.875771 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:48.875593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:48.875771 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:48.875692 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:48.875771 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:48.875735 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:49.004517 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:49.004443 2577 generic.go:358] "Generic (PLEG): container finished" podID="566f5317-730d-4bea-9936-998ff669835f" containerID="4834ecf7d595ef659389b94df7db868f376923426fda49c2cd80a082dc4959ea" exitCode=0 Apr 16 16:22:49.004942 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:49.004528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerDied","Data":"4834ecf7d595ef659389b94df7db868f376923426fda49c2cd80a082dc4959ea"} Apr 16 16:22:50.875270 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:50.875107 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:50.875700 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:50.875110 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:50.875700 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:50.875349 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-58j4z" podUID="0dcde09b-f2df-44aa-b593-4fc117b2e8f7" Apr 16 16:22:50.875700 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:50.875452 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932" Apr 16 16:22:52.141156 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.141120 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7d55g"] Apr 16 16:22:52.143050 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.143023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.143172 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.143114 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7d55g" podUID="9adfc688-8cd4-4e19-b964-829c6ec785ff" Apr 16 16:22:52.152974 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.152936 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7d55g"] Apr 16 16:22:52.228040 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.228004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9adfc688-8cd4-4e19-b964-829c6ec785ff-kubelet-config\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.228040 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.228051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9adfc688-8cd4-4e19-b964-829c6ec785ff-dbus\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.228268 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.228080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.328947 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.328915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9adfc688-8cd4-4e19-b964-829c6ec785ff-kubelet-config\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 
16 16:22:52.328947 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.328951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9adfc688-8cd4-4e19-b964-829c6ec785ff-dbus\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.329177 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.328970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.329177 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.329052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9adfc688-8cd4-4e19-b964-829c6ec785ff-kubelet-config\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.329177 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.329078 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:22:52.329177 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.329138 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret podName:9adfc688-8cd4-4e19-b964-829c6ec785ff nodeName:}" failed. No retries permitted until 2026-04-16 16:22:52.829124824 +0000 UTC m=+33.560363218 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret") pod "global-pull-secret-syncer-7d55g" (UID: "9adfc688-8cd4-4e19-b964-829c6ec785ff") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:22:52.329385 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.329240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9adfc688-8cd4-4e19-b964-829c6ec785ff-dbus\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.535699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.535675 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-246.ec2.internal" event="NodeReady" Apr 16 16:22:52.535827 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.535793 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:22:52.576483 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.576450 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd"] Apr 16 16:22:52.578604 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.578575 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.580977 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.580952 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp"] Apr 16 16:22:52.582836 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.582817 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"] Apr 16 16:22:52.583096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.583079 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.583411 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.583284 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:22:52.583661 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.583619 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:22:52.583757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.583676 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:22:52.583757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.583708 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-8zl6q\"" Apr 16 16:22:52.583864 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.583780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 16:22:52.584368 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.584349 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs"] Apr 16 16:22:52.584541 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.584520 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.585899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.585878 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.587352 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587332 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 16:22:52.587460 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587358 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 16:22:52.587618 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587601 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:22:52.587715 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:22:52.587715 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587638 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 16:22:52.587715 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587697 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 16:22:52.587882 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.587864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:22:52.588335 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.588012 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dwggv\"" Apr 16 16:22:52.588911 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.588886 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 16:22:52.592605 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.592587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:22:52.594990 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.594973 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8nvl6"] Apr 16 16:22:52.596629 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.596611 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd"] Apr 16 16:22:52.596736 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.596723 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.596827 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.596808 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp"] Apr 16 16:22:52.597706 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.597689 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs"] Apr 16 16:22:52.599103 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.599080 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:22:52.599251 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.599218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2bd8s\"" Apr 16 16:22:52.600292 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.600262 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:22:52.605094 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.605075 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"] Apr 16 16:22:52.609628 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.609606 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2cpdc"] Apr 16 16:22:52.611279 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.611261 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8nvl6"] Apr 16 16:22:52.611382 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.611369 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:52.616586 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.616564 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrwds\"" Apr 16 16:22:52.616713 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.616598 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:22:52.616713 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.616603 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:22:52.616873 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.616857 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:22:52.623412 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.623386 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2cpdc"] Apr 16 16:22:52.731135 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-trusted-ca\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.731135 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1dc478b-2290-4269-aeea-056949221c87-ca-trust-extracted\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " 
pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.731135 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/88ef2e06-1249-4fad-a9a8-b7c519a683ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-tmp\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nnjd\" (UniqueName: \"kubernetes.io/projected/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-kube-api-access-2nnjd\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4a69f824-1cbc-4f20-81e2-073bc726a590-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56748dbf97-rckmd\" (UID: \"4a69f824-1cbc-4f20-81e2-073bc726a590\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 
16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-installation-pull-secrets\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/435f0085-97d9-46f8-973a-ffb39094715d-config-volume\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-ca\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.731391 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/435f0085-97d9-46f8-973a-ffb39094715d-tmp-dir\") 
pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rc8\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-kube-api-access-s2rc8\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731482 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vj2\" (UniqueName: \"kubernetes.io/projected/88ef2e06-1249-4fad-a9a8-b7c519a683ce-kube-api-access-p8vj2\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwrd\" (UniqueName: \"kubernetes.io/projected/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-kube-api-access-zvwrd\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-klusterlet-config\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmmx\" (UniqueName: \"kubernetes.io/projected/435f0085-97d9-46f8-973a-ffb39094715d-kube-api-access-6rmmx\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.731775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731749 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:52.732108 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-registry-certificates\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.732108 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-hub\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.732108 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-bound-sa-token\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.732108 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdst9\" (UniqueName: \"kubernetes.io/projected/4a69f824-1cbc-4f20-81e2-073bc726a590-kube-api-access-bdst9\") pod 
\"managed-serviceaccount-addon-agent-56748dbf97-rckmd\" (UID: \"4a69f824-1cbc-4f20-81e2-073bc726a590\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.732108 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.731914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-image-registry-private-configuration\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.832594 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4a69f824-1cbc-4f20-81e2-073bc726a590-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56748dbf97-rckmd\" (UID: \"4a69f824-1cbc-4f20-81e2-073bc726a590\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.832594 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-installation-pull-secrets\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 
16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.832716 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/435f0085-97d9-46f8-973a-ffb39094715d-config-volume\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.832777 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:22:53.332758418 +0000 UTC m=+34.063996812 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found Apr 16 16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-ca\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.832846 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/435f0085-97d9-46f8-973a-ffb39094715d-tmp-dir\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832855 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rc8\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-kube-api-access-s2rc8\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vj2\" (UniqueName: \"kubernetes.io/projected/88ef2e06-1249-4fad-a9a8-b7c519a683ce-kube-api-access-p8vj2\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 
16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwrd\" (UniqueName: \"kubernetes.io/projected/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-kube-api-access-zvwrd\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.832986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-klusterlet-config\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rmmx\" (UniqueName: \"kubernetes.io/projected/435f0085-97d9-46f8-973a-ffb39094715d-kube-api-access-6rmmx\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:52.833196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-registry-certificates\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833308 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833363 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret podName:9adfc688-8cd4-4e19-b964-829c6ec785ff 
nodeName:}" failed. No retries permitted until 2026-04-16 16:22:53.833344961 +0000 UTC m=+34.564583360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret") pod "global-pull-secret-syncer-7d55g" (UID: "9adfc688-8cd4-4e19-b964-829c6ec785ff") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-hub\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-bound-sa-token\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdst9\" (UniqueName: \"kubernetes.io/projected/4a69f824-1cbc-4f20-81e2-073bc726a590-kube-api-access-bdst9\") pod \"managed-serviceaccount-addon-agent-56748dbf97-rckmd\" (UID: \"4a69f824-1cbc-4f20-81e2-073bc726a590\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" 
(UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-image-registry-private-configuration\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-trusted-ca\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1dc478b-2290-4269-aeea-056949221c87-ca-trust-extracted\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/88ef2e06-1249-4fad-a9a8-b7c519a683ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.833895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-tmp\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.833895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nnjd\" (UniqueName: \"kubernetes.io/projected/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-kube-api-access-2nnjd\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.833895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.833742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-registry-certificates\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.833895 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833819 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:22:52.833895 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833859 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:53.333843582 +0000 UTC m=+34.065081976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found Apr 16 16:22:52.834169 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833918 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:22:52.834169 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833930 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found Apr 16 16:22:52.834169 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:52.833960 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:53.333950599 +0000 UTC m=+34.065188997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found Apr 16 16:22:52.835013 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.834978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/435f0085-97d9-46f8-973a-ffb39094715d-config-volume\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.835429 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.835402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1dc478b-2290-4269-aeea-056949221c87-ca-trust-extracted\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.836293 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.836266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-trusted-ca\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.836553 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.836530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/435f0085-97d9-46f8-973a-ffb39094715d-tmp-dir\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.837682 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.837658 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-installation-pull-secrets\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.837915 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.837890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.838053 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.838035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/88ef2e06-1249-4fad-a9a8-b7c519a683ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.838277 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.838185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-tmp\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.838277 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.838201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-klusterlet-config\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: 
\"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.838277 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.838245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4a69f824-1cbc-4f20-81e2-073bc726a590-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56748dbf97-rckmd\" (UID: \"4a69f824-1cbc-4f20-81e2-073bc726a590\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.838923 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.838903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.839113 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.839094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-hub\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.840778 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.840758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-image-registry-private-configuration\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.842296 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:22:52.842278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/88ef2e06-1249-4fad-a9a8-b7c519a683ce-ca\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.844474 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.844454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nnjd\" (UniqueName: \"kubernetes.io/projected/4dc164ff-ee4d-4b73-8d22-47f7263b67e4-kube-api-access-2nnjd\") pod \"klusterlet-addon-workmgr-75c6bf8858-hbdgs\" (UID: \"4dc164ff-ee4d-4b73-8d22-47f7263b67e4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:52.844818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.844797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmmx\" (UniqueName: \"kubernetes.io/projected/435f0085-97d9-46f8-973a-ffb39094715d-kube-api-access-6rmmx\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:52.848775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.848102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdst9\" (UniqueName: \"kubernetes.io/projected/4a69f824-1cbc-4f20-81e2-073bc726a590-kube-api-access-bdst9\") pod \"managed-serviceaccount-addon-agent-56748dbf97-rckmd\" (UID: \"4a69f824-1cbc-4f20-81e2-073bc726a590\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.848775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.848367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-bound-sa-token\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.848775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.848468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwrd\" (UniqueName: \"kubernetes.io/projected/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-kube-api-access-zvwrd\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:52.850024 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.850007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vj2\" (UniqueName: \"kubernetes.io/projected/88ef2e06-1249-4fad-a9a8-b7c519a683ce-kube-api-access-p8vj2\") pod \"cluster-proxy-proxy-agent-6dcdd7968c-jrnbp\" (UID: \"88ef2e06-1249-4fad-a9a8-b7c519a683ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.850373 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.850357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rc8\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-kube-api-access-s2rc8\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:52.875465 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.875450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:52.875465 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.875462 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:52.878512 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.878491 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:22:52.879104 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.879081 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:22:52.879203 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.879136 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pm5r6\"" Apr 16 16:22:52.879427 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.879411 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:22:52.879857 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.879827 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wtfd7\"" Apr 16 16:22:52.898576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.898559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" Apr 16 16:22:52.914240 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.914221 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:22:52.933308 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:52.933020 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:22:53.012598 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.012533 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:53.015430 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.015412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:22:53.338597 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.338509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:53.338597 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.338554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.338603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338716 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338729 
2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338739 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338725 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338797 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:54.338776279 +0000 UTC m=+35.070014690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338821 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:54.33880503 +0000 UTC m=+35.070043423 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found Apr 16 16:22:53.339297 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.338840 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:22:54.338831002 +0000 UTC m=+35.070069439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found Apr 16 16:22:53.541333 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.541297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:22:53.541502 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.541450 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:22:53.541543 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:53.541523 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:25.541505014 +0000 UTC m=+66.272743415 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : secret "metrics-daemon-secret" not found Apr 16 16:22:53.642178 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.642088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:53.645101 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.645076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wztjn\" (UniqueName: \"kubernetes.io/projected/0dcde09b-f2df-44aa-b593-4fc117b2e8f7-kube-api-access-wztjn\") pod \"network-check-target-58j4z\" (UID: \"0dcde09b-f2df-44aa-b593-4fc117b2e8f7\") " pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:53.786834 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.786800 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-58j4z" Apr 16 16:22:53.844698 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.844665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:53.847983 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.847952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9adfc688-8cd4-4e19-b964-829c6ec785ff-original-pull-secret\") pod \"global-pull-secret-syncer-7d55g\" (UID: \"9adfc688-8cd4-4e19-b964-829c6ec785ff\") " pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:53.923437 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:53.923368 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7d55g" Apr 16 16:22:54.347901 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.347869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.347952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.347998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:54.348049 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:54.348062 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:54.348080 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: 
E0416 16:22:54.348087 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:54.348114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:22:56.348093681 +0000 UTC m=+37.079332078 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:54.348132 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:56.348120618 +0000 UTC m=+37.079359013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found Apr 16 16:22:54.348491 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:54.348145 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:22:56.34813873 +0000 UTC m=+37.079377124 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found Apr 16 16:22:54.981333 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.981175 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-58j4z"] Apr 16 16:22:54.990465 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.988430 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7d55g"] Apr 16 16:22:54.990465 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.990179 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd"] Apr 16 16:22:54.991692 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.991662 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs"] Apr 16 16:22:54.992329 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:54.992310 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp"] Apr 16 16:22:55.044966 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:55.044933 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dcde09b_f2df_44aa_b593_4fc117b2e8f7.slice/crio-13a535ac9f3c833b14adb67098a1e17cf3e2ed7ff952a764ce2fdcc4603c4736 WatchSource:0}: Error finding container 13a535ac9f3c833b14adb67098a1e17cf3e2ed7ff952a764ce2fdcc4603c4736: Status 404 returned error can't find the container with id 13a535ac9f3c833b14adb67098a1e17cf3e2ed7ff952a764ce2fdcc4603c4736 Apr 16 16:22:55.045828 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:55.045800 2577 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9adfc688_8cd4_4e19_b964_829c6ec785ff.slice/crio-acca71305eb49e684a298081bde4df8f2509d35cad1335a97fff2b25651a631d WatchSource:0}: Error finding container acca71305eb49e684a298081bde4df8f2509d35cad1335a97fff2b25651a631d: Status 404 returned error can't find the container with id acca71305eb49e684a298081bde4df8f2509d35cad1335a97fff2b25651a631d Apr 16 16:22:55.047065 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:55.047032 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a69f824_1cbc_4f20_81e2_073bc726a590.slice/crio-bb8423074b0687a5f791b8acfbfff4aa0016d2f02faa03e396ed5f3c810d866d WatchSource:0}: Error finding container bb8423074b0687a5f791b8acfbfff4aa0016d2f02faa03e396ed5f3c810d866d: Status 404 returned error can't find the container with id bb8423074b0687a5f791b8acfbfff4aa0016d2f02faa03e396ed5f3c810d866d Apr 16 16:22:55.048293 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:55.047664 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ef2e06_1249_4fad_a9a8_b7c519a683ce.slice/crio-22c1126287584aaa16b77841fe1a9d9454507401c4b76789ffc5b9754786bfec WatchSource:0}: Error finding container 22c1126287584aaa16b77841fe1a9d9454507401c4b76789ffc5b9754786bfec: Status 404 returned error can't find the container with id 22c1126287584aaa16b77841fe1a9d9454507401c4b76789ffc5b9754786bfec Apr 16 16:22:55.048448 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:22:55.048297 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc164ff_ee4d_4b73_8d22_47f7263b67e4.slice/crio-1c215fe6f255ce71ea67d8478b33dfac527932d5fbf0c1e1fb71a11121ff054b WatchSource:0}: Error finding container 
1c215fe6f255ce71ea67d8478b33dfac527932d5fbf0c1e1fb71a11121ff054b: Status 404 returned error can't find the container with id 1c215fe6f255ce71ea67d8478b33dfac527932d5fbf0c1e1fb71a11121ff054b Apr 16 16:22:56.030557 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.029551 2577 generic.go:358] "Generic (PLEG): container finished" podID="566f5317-730d-4bea-9936-998ff669835f" containerID="90acca348d4fec1a03ed17a7f4f53aa80072422352859f1453bf20de10d5565d" exitCode=0 Apr 16 16:22:56.030557 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.029662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerDied","Data":"90acca348d4fec1a03ed17a7f4f53aa80072422352859f1453bf20de10d5565d"} Apr 16 16:22:56.033601 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.033426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" event={"ID":"4dc164ff-ee4d-4b73-8d22-47f7263b67e4","Type":"ContainerStarted","Data":"1c215fe6f255ce71ea67d8478b33dfac527932d5fbf0c1e1fb71a11121ff054b"} Apr 16 16:22:56.037214 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.037153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-58j4z" event={"ID":"0dcde09b-f2df-44aa-b593-4fc117b2e8f7","Type":"ContainerStarted","Data":"13a535ac9f3c833b14adb67098a1e17cf3e2ed7ff952a764ce2fdcc4603c4736"} Apr 16 16:22:56.041419 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.041370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7d55g" event={"ID":"9adfc688-8cd4-4e19-b964-829c6ec785ff","Type":"ContainerStarted","Data":"acca71305eb49e684a298081bde4df8f2509d35cad1335a97fff2b25651a631d"} Apr 16 16:22:56.042738 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.042689 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" event={"ID":"4a69f824-1cbc-4f20-81e2-073bc726a590","Type":"ContainerStarted","Data":"bb8423074b0687a5f791b8acfbfff4aa0016d2f02faa03e396ed5f3c810d866d"} Apr 16 16:22:56.048484 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.048460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" event={"ID":"88ef2e06-1249-4fad-a9a8-b7c519a683ce","Type":"ContainerStarted","Data":"22c1126287584aaa16b77841fe1a9d9454507401c4b76789ffc5b9754786bfec"} Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.368195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6" Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.368263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:56.368292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc" Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.368404 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found
Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.368464 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:00.368444985 +0000 UTC m=+41.099683384 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found
Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.370827 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.370845 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found
Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.370903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:00.370879188 +0000 UTC m=+41.102117581 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found
Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.370984 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:22:56.371107 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:22:56.371022 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:00.371008813 +0000 UTC m=+41.102247209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found
Apr 16 16:22:57.056968 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:57.056929 2577 generic.go:358] "Generic (PLEG): container finished" podID="566f5317-730d-4bea-9936-998ff669835f" containerID="2cc57f3f442dfc7a36322e139070a0f5dceec08970f9bd96aeb98c70ec250aa2" exitCode=0
Apr 16 16:22:57.057393 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:57.057016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerDied","Data":"2cc57f3f442dfc7a36322e139070a0f5dceec08970f9bd96aeb98c70ec250aa2"}
Apr 16 16:22:58.070373 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:58.070332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7fht" event={"ID":"566f5317-730d-4bea-9936-998ff669835f","Type":"ContainerStarted","Data":"2539eaab8a7c96e1324227313800019a392fde0a18cf4c893192f163bd3cf464"}
Apr 16 16:22:58.106331 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:22:58.106282 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b7fht" podStartSLOduration=5.685568963 podStartE2EDuration="38.106268986s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="2026-04-16 16:22:22.672383876 +0000 UTC m=+3.403622270" lastFinishedPulling="2026-04-16 16:22:55.093083898 +0000 UTC m=+35.824322293" observedRunningTime="2026-04-16 16:22:58.105214109 +0000 UTC m=+38.836452526" watchObservedRunningTime="2026-04-16 16:22:58.106268986 +0000 UTC m=+38.837507401"
Apr 16 16:23:00.401161 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:00.400923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:00.401211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401086 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:00.401282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401311 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:08.401287001 +0000 UTC m=+49.132525404 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401368 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401399 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401416 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401425 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:08.401406631 +0000 UTC m=+49.132645030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found
Apr 16 16:23:00.401616 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:00.401450 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:08.401439512 +0000 UTC m=+49.132677906 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found
Apr 16 16:23:06.088088 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.088046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" event={"ID":"88ef2e06-1249-4fad-a9a8-b7c519a683ce","Type":"ContainerStarted","Data":"534df71eaabf78a4298ca6eb86d768457f31ca7b67749083a2d5321d2a0b671d"}
Apr 16 16:23:06.089291 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.089264 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" event={"ID":"4dc164ff-ee4d-4b73-8d22-47f7263b67e4","Type":"ContainerStarted","Data":"54a21a732a794530761aa5971aba5ab6afc8fc088cbfd83a084e6ddf583d7709"}
Apr 16 16:23:06.089477 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.089454 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs"
Apr 16 16:23:06.090736 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.090687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-58j4z" event={"ID":"0dcde09b-f2df-44aa-b593-4fc117b2e8f7","Type":"ContainerStarted","Data":"fc4f61603ad003b5793e5f034053a47b611b8114561c6de5a85cabca780183c3"}
Apr 16 16:23:06.090833 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.090787 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-58j4z"
Apr 16 16:23:06.091304 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.091276 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs"
Apr 16 16:23:06.091951 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.091933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7d55g" event={"ID":"9adfc688-8cd4-4e19-b964-829c6ec785ff","Type":"ContainerStarted","Data":"71f572da9191c6a051131e1240886c474b0b9b5523cb5d75fe9c3fe850c23240"}
Apr 16 16:23:06.093144 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.093128 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" event={"ID":"4a69f824-1cbc-4f20-81e2-073bc726a590","Type":"ContainerStarted","Data":"3186ad746bc9f3c26aa3d8778be81dbc69092a15a6abf53910b6ffa223a68e68"}
Apr 16 16:23:06.106815 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.106781 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" podStartSLOduration=29.161293019 podStartE2EDuration="39.106765044s" podCreationTimestamp="2026-04-16 16:22:27 +0000 UTC" firstStartedPulling="2026-04-16 16:22:55.070303662 +0000 UTC m=+35.801542062" lastFinishedPulling="2026-04-16 16:23:05.015775693 +0000 UTC m=+45.747014087" observedRunningTime="2026-04-16 16:23:06.106738939 +0000 UTC m=+46.837977354" watchObservedRunningTime="2026-04-16 16:23:06.106765044 +0000 UTC m=+46.838003462"
Apr 16 16:23:06.138743 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.138697 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-58j4z" podStartSLOduration=37.185837682 podStartE2EDuration="47.138684797s" podCreationTimestamp="2026-04-16 16:22:19 +0000 UTC" firstStartedPulling="2026-04-16 16:22:55.047428019 +0000 UTC m=+35.778666416" lastFinishedPulling="2026-04-16 16:23:05.000275118 +0000 UTC m=+45.731513531" observedRunningTime="2026-04-16 16:23:06.122854125 +0000 UTC m=+46.854092543" watchObservedRunningTime="2026-04-16 16:23:06.138684797 +0000 UTC m=+46.869923213"
Apr 16 16:23:06.155344 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.155039 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7d55g" podStartSLOduration=4.224765691 podStartE2EDuration="14.155024459s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="2026-04-16 16:22:55.070443036 +0000 UTC m=+35.801681439" lastFinishedPulling="2026-04-16 16:23:05.00070181 +0000 UTC m=+45.731940207" observedRunningTime="2026-04-16 16:23:06.153713748 +0000 UTC m=+46.884952164" watchObservedRunningTime="2026-04-16 16:23:06.155024459 +0000 UTC m=+46.886262882"
Apr 16 16:23:06.170796 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:06.170760 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" podStartSLOduration=29.240965428 podStartE2EDuration="39.170749488s" podCreationTimestamp="2026-04-16 16:22:27 +0000 UTC" firstStartedPulling="2026-04-16 16:22:55.070495565 +0000 UTC m=+35.801733959" lastFinishedPulling="2026-04-16 16:23:05.000279602 +0000 UTC m=+45.731518019" observedRunningTime="2026-04-16 16:23:06.169791958 +0000 UTC m=+46.901030376" watchObservedRunningTime="2026-04-16 16:23:06.170749488 +0000 UTC m=+46.901987895"
Apr 16 16:23:08.458830 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:08.458790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:08.458866 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:08.458896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.458953 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.459008 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.459012 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.459028 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.459032 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:24.459009798 +0000 UTC m=+65.190248213 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.459083 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:24.459067873 +0000 UTC m=+65.190306269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found
Apr 16 16:23:08.459266 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:08.459099 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:24.459092365 +0000 UTC m=+65.190330759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found
Apr 16 16:23:09.102361 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:09.102336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" event={"ID":"88ef2e06-1249-4fad-a9a8-b7c519a683ce","Type":"ContainerStarted","Data":"882a2a91b81018fa572d18f912f6e8c2cbd47a4a2ca157fce3816d343ca8b864"}
Apr 16 16:23:10.107019 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:10.106980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" event={"ID":"88ef2e06-1249-4fad-a9a8-b7c519a683ce","Type":"ContainerStarted","Data":"3e6dd8a49ae1d3903c2aa28c2ab9824b3ad1b1ef2d966b46fdb15dd34ed61344"}
Apr 16 16:23:10.129162 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:10.129115 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" podStartSLOduration=29.228868482 podStartE2EDuration="43.129101275s" podCreationTimestamp="2026-04-16 16:22:27 +0000 UTC" firstStartedPulling="2026-04-16 16:22:55.070336289 +0000 UTC m=+35.801574693" lastFinishedPulling="2026-04-16 16:23:08.970569093 +0000 UTC m=+49.701807486" observedRunningTime="2026-04-16 16:23:10.127771296 +0000 UTC m=+50.859009712" watchObservedRunningTime="2026-04-16 16:23:10.129101275 +0000 UTC m=+50.860339691"
Apr 16 16:23:19.016128 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:19.016096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8zb4q"
Apr 16 16:23:24.474251 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:24.474212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:23:24.474251 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:24.474254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:24.474299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474367 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474387 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474407 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474441 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474453 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:56.47443641 +0000 UTC m=+97.205674821 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474467 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:23:56.474461341 +0000 UTC m=+97.205699735 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:24.474686 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:24.474492 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:56.474476382 +0000 UTC m=+97.205714777 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found
Apr 16 16:23:25.582482 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:25.582445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w"
Apr 16 16:23:25.582858 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:25.582586 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:23:25.582858 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:25.582667 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:29.582635485 +0000 UTC m=+130.313873878 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : secret "metrics-daemon-secret" not found
Apr 16 16:23:37.098159 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:37.098053 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-58j4z"
Apr 16 16:23:56.499059 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:56.499010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:23:56.499059 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:56.499062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499144 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499156 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499174 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7cbd9b8f5-tqmnr: secret "image-registry-tls" not found
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499203 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert podName:aab232ac-48c1-4811-b5d5-6ae9bf4d5040 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:00.499188891 +0000 UTC m=+161.230427286 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert") pod "ingress-canary-2cpdc" (UID: "aab232ac-48c1-4811-b5d5-6ae9bf4d5040") : secret "canary-serving-cert" not found
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:23:56.499218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499226 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls podName:b1dc478b-2290-4269-aeea-056949221c87 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:00.499213097 +0000 UTC m=+161.230451491 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls") pod "image-registry-7cbd9b8f5-tqmnr" (UID: "b1dc478b-2290-4269-aeea-056949221c87") : secret "image-registry-tls" not found
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499257 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:56.499615 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:23:56.499279 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls podName:435f0085-97d9-46f8-973a-ffb39094715d nodeName:}" failed. No retries permitted until 2026-04-16 16:25:00.499271651 +0000 UTC m=+161.230510045 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls") pod "dns-default-8nvl6" (UID: "435f0085-97d9-46f8-973a-ffb39094715d") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:29.630906 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:24:29.630863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w"
Apr 16 16:24:29.631498 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:24:29.631035 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:24:29.631498 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:24:29.631133 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs podName:a648a078-0f71-4a2f-a255-ad1937929932 nodeName:}" failed. No retries permitted until 2026-04-16 16:26:31.631110355 +0000 UTC m=+252.362348749 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs") pod "network-metrics-daemon-2wd9w" (UID: "a648a078-0f71-4a2f-a255-ad1937929932") : secret "metrics-daemon-secret" not found
Apr 16 16:24:44.352940 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:24:44.352912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jchwl_9494261b-183d-4f87-ae51-80217757eafa/dns-node-resolver/0.log"
Apr 16 16:24:44.954111 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:24:44.954081 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-628rf_ff2dc9df-f5a6-47e5-9597-5d45855573cd/node-ca/0.log"
Apr 16 16:24:55.622271 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:24:55.622226 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" podUID="b1dc478b-2290-4269-aeea-056949221c87"
Apr 16 16:24:55.640487 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:24:55.640464 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8nvl6" podUID="435f0085-97d9-46f8-973a-ffb39094715d"
Apr 16 16:24:55.646650 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:24:55.646612 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2cpdc" podUID="aab232ac-48c1-4811-b5d5-6ae9bf4d5040"
Apr 16 16:24:55.893363 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:24:55.893280 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2wd9w" podUID="a648a078-0f71-4a2f-a255-ad1937929932"
Apr 16 16:24:56.354634 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:24:56.354602 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:24:56.354818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:24:56.354606 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:24:56.354818 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:24:56.354606 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:25:00.554458 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.554428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:25:00.554458 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.554464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:25:00.554907 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.554502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:25:00.558241 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.558216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/435f0085-97d9-46f8-973a-ffb39094715d-metrics-tls\") pod \"dns-default-8nvl6\" (UID: \"435f0085-97d9-46f8-973a-ffb39094715d\") " pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:25:00.558372 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.558259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"image-registry-7cbd9b8f5-tqmnr\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:25:00.558430 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.558385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab232ac-48c1-4811-b5d5-6ae9bf4d5040-cert\") pod \"ingress-canary-2cpdc\" (UID: \"aab232ac-48c1-4811-b5d5-6ae9bf4d5040\") " pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:25:00.859517 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.859435 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrwds\""
Apr 16 16:25:00.859517 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.859434 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2bd8s\""
Apr 16 16:25:00.859880 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.859633 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dwggv\""
Apr 16 16:25:00.865997 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.865980 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8nvl6"
Apr 16 16:25:00.866095 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.866016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:25:00.866095 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:00.866058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2cpdc"
Apr 16 16:25:01.014753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.014725 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"]
Apr 16 16:25:01.016923 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:25:01.016894 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1dc478b_2290_4269_aeea_056949221c87.slice/crio-9690f1732da27043490475ed65cfca4df0ecb230e9aa845ea64beadfa6c81e91 WatchSource:0}: Error finding container 9690f1732da27043490475ed65cfca4df0ecb230e9aa845ea64beadfa6c81e91: Status 404 returned error can't find the container with id 9690f1732da27043490475ed65cfca4df0ecb230e9aa845ea64beadfa6c81e91
Apr 16 16:25:01.225822 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.225781 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2cpdc"]
Apr 16 16:25:01.228109 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.228081 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8nvl6"]
Apr 16 16:25:01.230859 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:25:01.230833 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab232ac_48c1_4811_b5d5_6ae9bf4d5040.slice/crio-e48936e27080ba161e45e85a30b7b6c4139af875c9340a2fcb68c0109993de95 WatchSource:0}: Error finding container e48936e27080ba161e45e85a30b7b6c4139af875c9340a2fcb68c0109993de95: Status 404 returned error can't find the container with id e48936e27080ba161e45e85a30b7b6c4139af875c9340a2fcb68c0109993de95
Apr 16 16:25:01.231719 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:25:01.231695 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod435f0085_97d9_46f8_973a_ffb39094715d.slice/crio-ca63fbf94ce058098aa65001c90ffff8e04884ddcb6e790c84991d75ec14fb17 WatchSource:0}: Error finding container ca63fbf94ce058098aa65001c90ffff8e04884ddcb6e790c84991d75ec14fb17: Status 404 returned error can't find the container with id ca63fbf94ce058098aa65001c90ffff8e04884ddcb6e790c84991d75ec14fb17
Apr 16 16:25:01.367140 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.367103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2cpdc" event={"ID":"aab232ac-48c1-4811-b5d5-6ae9bf4d5040","Type":"ContainerStarted","Data":"e48936e27080ba161e45e85a30b7b6c4139af875c9340a2fcb68c0109993de95"}
Apr 16 16:25:01.368445 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.368415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" event={"ID":"b1dc478b-2290-4269-aeea-056949221c87","Type":"ContainerStarted","Data":"b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de"}
Apr 16 16:25:01.368445 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.368448 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" event={"ID":"b1dc478b-2290-4269-aeea-056949221c87","Type":"ContainerStarted","Data":"9690f1732da27043490475ed65cfca4df0ecb230e9aa845ea64beadfa6c81e91"}
Apr 16 16:25:01.368680 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.368519 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"
Apr 16 16:25:01.369663 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.369620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8nvl6" event={"ID":"435f0085-97d9-46f8-973a-ffb39094715d","Type":"ContainerStarted","Data":"ca63fbf94ce058098aa65001c90ffff8e04884ddcb6e790c84991d75ec14fb17"}
Apr 16 16:25:01.402075 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:01.402026 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" podStartSLOduration=161.402009342 podStartE2EDuration="2m41.402009342s" podCreationTimestamp="2026-04-16 16:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:01.401511716 +0000 UTC m=+162.132750143" watchObservedRunningTime="2026-04-16 16:25:01.402009342 +0000 UTC m=+162.133247759"
Apr 16 16:25:02.988578 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.988498 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2j4nh"]
Apr 16 16:25:02.991788 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.991767 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:02.995959 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.995880 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:25:02.995959 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.995898 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-t6bvr\"" Apr 16 16:25:02.996124 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.996082 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:25:02.996311 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.996292 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:25:02.996757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:02.996738 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:25:03.029011 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.028983 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2j4nh"] Apr 16 16:25:03.075945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.075912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvw5\" (UniqueName: \"kubernetes.io/projected/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-kube-api-access-zsvw5\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.076098 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.075979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.076153 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.076092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.076153 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.076141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-data-volume\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.076296 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.076170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-crio-socket\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.176919 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.176893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvw5\" (UniqueName: \"kubernetes.io/projected/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-kube-api-access-zsvw5\") pod \"insights-runtime-extractor-2j4nh\" (UID: 
\"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177067 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.176949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177067 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.176983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177067 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.177009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-data-volume\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177067 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.177030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-crio-socket\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177273 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.177114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-crio-socket\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177390 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.177367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-data-volume\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.177627 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.177599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.179812 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.179784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.208588 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.208565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvw5\" (UniqueName: \"kubernetes.io/projected/8cfac8c1-5283-430f-85c6-be8d1e0f94cc-kube-api-access-zsvw5\") pod \"insights-runtime-extractor-2j4nh\" (UID: \"8cfac8c1-5283-430f-85c6-be8d1e0f94cc\") " pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.304151 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:25:03.304044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2j4nh" Apr 16 16:25:03.495738 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:03.495521 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2j4nh"] Apr 16 16:25:03.500477 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:25:03.500447 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cfac8c1_5283_430f_85c6_be8d1e0f94cc.slice/crio-7a268dea02b3adabea1c4d063ff0754f00be804e29e993a52d107494e613d372 WatchSource:0}: Error finding container 7a268dea02b3adabea1c4d063ff0754f00be804e29e993a52d107494e613d372: Status 404 returned error can't find the container with id 7a268dea02b3adabea1c4d063ff0754f00be804e29e993a52d107494e613d372 Apr 16 16:25:04.378380 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.378353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8nvl6" event={"ID":"435f0085-97d9-46f8-973a-ffb39094715d","Type":"ContainerStarted","Data":"7b304358281c9a7592cdb484409008ec209b19e737b71c5a2e0c9c235869cff1"} Apr 16 16:25:04.378731 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.378390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8nvl6" event={"ID":"435f0085-97d9-46f8-973a-ffb39094715d","Type":"ContainerStarted","Data":"86dce7e3e5a7dfc45c9b5b27cda52cf29a3591ae45d10cc4fd0a016b0f94e834"} Apr 16 16:25:04.378731 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.378450 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8nvl6" Apr 16 16:25:04.379605 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.379581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j4nh" 
event={"ID":"8cfac8c1-5283-430f-85c6-be8d1e0f94cc","Type":"ContainerStarted","Data":"7e471bd5a72bd3885747e68fb25637af57671ccfd6278c27074e7a6ba1b7dc08"} Apr 16 16:25:04.379698 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.379614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j4nh" event={"ID":"8cfac8c1-5283-430f-85c6-be8d1e0f94cc","Type":"ContainerStarted","Data":"7a268dea02b3adabea1c4d063ff0754f00be804e29e993a52d107494e613d372"} Apr 16 16:25:04.380677 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.380655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2cpdc" event={"ID":"aab232ac-48c1-4811-b5d5-6ae9bf4d5040","Type":"ContainerStarted","Data":"f4d3e9e5b6f179bac165652808e5e873f4e7de72ceeaf00524ce8f0f97de706c"} Apr 16 16:25:04.401423 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.401387 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8nvl6" podStartSLOduration=130.286847445 podStartE2EDuration="2m12.401376513s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="2026-04-16 16:25:01.233985612 +0000 UTC m=+161.965224007" lastFinishedPulling="2026-04-16 16:25:03.348514666 +0000 UTC m=+164.079753075" observedRunningTime="2026-04-16 16:25:04.401134159 +0000 UTC m=+165.132372576" watchObservedRunningTime="2026-04-16 16:25:04.401376513 +0000 UTC m=+165.132614929" Apr 16 16:25:04.422248 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:04.422213 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2cpdc" podStartSLOduration=130.307601701 podStartE2EDuration="2m12.42220183s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="2026-04-16 16:25:01.232815878 +0000 UTC m=+161.964054279" lastFinishedPulling="2026-04-16 16:25:03.347415998 +0000 UTC m=+164.078654408" observedRunningTime="2026-04-16 
16:25:04.421830405 +0000 UTC m=+165.153068822" watchObservedRunningTime="2026-04-16 16:25:04.42220183 +0000 UTC m=+165.153440245" Apr 16 16:25:05.384577 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.384550 2577 generic.go:358] "Generic (PLEG): container finished" podID="4dc164ff-ee4d-4b73-8d22-47f7263b67e4" containerID="54a21a732a794530761aa5971aba5ab6afc8fc088cbfd83a084e6ddf583d7709" exitCode=1 Apr 16 16:25:05.384962 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.384629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" event={"ID":"4dc164ff-ee4d-4b73-8d22-47f7263b67e4","Type":"ContainerDied","Data":"54a21a732a794530761aa5971aba5ab6afc8fc088cbfd83a084e6ddf583d7709"} Apr 16 16:25:05.385033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.385012 2577 scope.go:117] "RemoveContainer" containerID="54a21a732a794530761aa5971aba5ab6afc8fc088cbfd83a084e6ddf583d7709" Apr 16 16:25:05.385999 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.385977 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a69f824-1cbc-4f20-81e2-073bc726a590" containerID="3186ad746bc9f3c26aa3d8778be81dbc69092a15a6abf53910b6ffa223a68e68" exitCode=255 Apr 16 16:25:05.386085 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.386054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" event={"ID":"4a69f824-1cbc-4f20-81e2-073bc726a590","Type":"ContainerDied","Data":"3186ad746bc9f3c26aa3d8778be81dbc69092a15a6abf53910b6ffa223a68e68"} Apr 16 16:25:05.386403 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.386356 2577 scope.go:117] "RemoveContainer" containerID="3186ad746bc9f3c26aa3d8778be81dbc69092a15a6abf53910b6ffa223a68e68" Apr 16 16:25:05.388079 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:05.388011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-2j4nh" event={"ID":"8cfac8c1-5283-430f-85c6-be8d1e0f94cc","Type":"ContainerStarted","Data":"8a72c7a67473a294e68fa7af395361705c6d3a9800fc772176256a95d391c160"} Apr 16 16:25:06.089995 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:06.089958 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:25:06.395048 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:06.394954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" event={"ID":"4dc164ff-ee4d-4b73-8d22-47f7263b67e4","Type":"ContainerStarted","Data":"716eef243db88aef80018a218c1883821934958894c8e4b131933b2aa563b476"} Apr 16 16:25:06.395473 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:06.395212 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:25:06.395906 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:06.395885 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-75c6bf8858-hbdgs" Apr 16 16:25:06.397347 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:06.397323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56748dbf97-rckmd" event={"ID":"4a69f824-1cbc-4f20-81e2-073bc726a590","Type":"ContainerStarted","Data":"46577e5c94ff97fc73db583b0ed4e25dff581dd13c8502ba5899e5d47aba03b4"} Apr 16 16:25:07.401540 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:07.401504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j4nh" 
event={"ID":"8cfac8c1-5283-430f-85c6-be8d1e0f94cc","Type":"ContainerStarted","Data":"415e12267ec25cba234da9f66622c06af7a1ce19a8e6be7dcd6b2d0bde0b81a1"} Apr 16 16:25:07.430920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:07.430873 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2j4nh" podStartSLOduration=2.526054634 podStartE2EDuration="5.430859933s" podCreationTimestamp="2026-04-16 16:25:02 +0000 UTC" firstStartedPulling="2026-04-16 16:25:03.57524293 +0000 UTC m=+164.306481332" lastFinishedPulling="2026-04-16 16:25:06.480048237 +0000 UTC m=+167.211286631" observedRunningTime="2026-04-16 16:25:07.429882449 +0000 UTC m=+168.161120865" watchObservedRunningTime="2026-04-16 16:25:07.430859933 +0000 UTC m=+168.162098348" Apr 16 16:25:09.876572 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:09.876544 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:25:11.653027 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.652994 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vms6s"] Apr 16 16:25:11.656271 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.656234 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.660451 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.660431 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:25:11.660451 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.660444 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:25:11.660579 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.660450 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:25:11.661143 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.661128 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mp6v2\"" Apr 16 16:25:11.661324 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.661306 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:25:11.661595 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.661575 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:25:11.661711 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.661620 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:25:11.740761 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-sys\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " 
pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.740881 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-accelerators-collector-config\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.740881 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2j5\" (UniqueName: \"kubernetes.io/projected/aae7248d-bc69-473b-8c2d-45d55385b6a5-kube-api-access-lv2j5\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.740881 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.740978 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740909 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-root\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.740978 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.740978 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-textfile\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.741062 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.740990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-wtmp\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.741062 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.741015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae7248d-bc69-473b-8c2d-45d55385b6a5-metrics-client-ca\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.841731 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-textfile\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.841850 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:25:11.841736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-wtmp\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.841850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae7248d-bc69-473b-8c2d-45d55385b6a5-metrics-client-ca\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.841850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-sys\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.841850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-accelerators-collector-config\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.841850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2j5\" (UniqueName: \"kubernetes.io/projected/aae7248d-bc69-473b-8c2d-45d55385b6a5-kube-api-access-lv2j5\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " 
pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-sys\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-wtmp\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.841935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-root\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.842001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " 
pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.842036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aae7248d-bc69-473b-8c2d-45d55385b6a5-root\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842096 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:25:11.842092 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:25:11.842415 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:25:11.842151 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls podName:aae7248d-bc69-473b-8c2d-45d55385b6a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:12.342134283 +0000 UTC m=+173.073372681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls") pod "node-exporter-vms6s" (UID: "aae7248d-bc69-473b-8c2d-45d55385b6a5") : secret "node-exporter-tls" not found Apr 16 16:25:11.842481 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.842451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aae7248d-bc69-473b-8c2d-45d55385b6a5-metrics-client-ca\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842568 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.842548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-accelerators-collector-config\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.842617 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.842586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-textfile\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.844344 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:11.844325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:11.850465 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:25:11.850442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2j5\" (UniqueName: \"kubernetes.io/projected/aae7248d-bc69-473b-8c2d-45d55385b6a5-kube-api-access-lv2j5\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:12.345371 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:12.345328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:12.345553 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:25:12.345446 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:25:12.345553 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:25:12.345505 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls podName:aae7248d-bc69-473b-8c2d-45d55385b6a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:13.345491962 +0000 UTC m=+174.076730357 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls") pod "node-exporter-vms6s" (UID: "aae7248d-bc69-473b-8c2d-45d55385b6a5") : secret "node-exporter-tls" not found Apr 16 16:25:13.353306 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:13.353274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:13.355612 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:13.355586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aae7248d-bc69-473b-8c2d-45d55385b6a5-node-exporter-tls\") pod \"node-exporter-vms6s\" (UID: \"aae7248d-bc69-473b-8c2d-45d55385b6a5\") " pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:13.464932 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:13.464905 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vms6s" Apr 16 16:25:13.472675 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:25:13.472633 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae7248d_bc69_473b_8c2d_45d55385b6a5.slice/crio-67cdf1194852129dcafa9c4f9b19d8e75c854328ebfa79193b6c93470a4e0dce WatchSource:0}: Error finding container 67cdf1194852129dcafa9c4f9b19d8e75c854328ebfa79193b6c93470a4e0dce: Status 404 returned error can't find the container with id 67cdf1194852129dcafa9c4f9b19d8e75c854328ebfa79193b6c93470a4e0dce Apr 16 16:25:14.390565 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:14.390544 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8nvl6" Apr 16 16:25:14.419802 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:14.419776 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vms6s" event={"ID":"aae7248d-bc69-473b-8c2d-45d55385b6a5","Type":"ContainerStarted","Data":"67cdf1194852129dcafa9c4f9b19d8e75c854328ebfa79193b6c93470a4e0dce"} Apr 16 16:25:15.426622 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:15.426588 2577 generic.go:358] "Generic (PLEG): container finished" podID="aae7248d-bc69-473b-8c2d-45d55385b6a5" containerID="57760f066f7ed61bfaa6a28e9429df0fbd2f3eaa00532ff979b807125792b703" exitCode=0 Apr 16 16:25:15.427014 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:15.426659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vms6s" event={"ID":"aae7248d-bc69-473b-8c2d-45d55385b6a5","Type":"ContainerDied","Data":"57760f066f7ed61bfaa6a28e9429df0fbd2f3eaa00532ff979b807125792b703"} Apr 16 16:25:16.431454 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:16.431414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vms6s" 
event={"ID":"aae7248d-bc69-473b-8c2d-45d55385b6a5","Type":"ContainerStarted","Data":"8786c6bea16f9fa5a327882a49bc6ffe526953a03acda7d1a16604ffd572002f"} Apr 16 16:25:16.431454 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:16.431451 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vms6s" event={"ID":"aae7248d-bc69-473b-8c2d-45d55385b6a5","Type":"ContainerStarted","Data":"0fc329687db25dd3ec124d3c1fd8d80713ef7449ce46210b675219173257a7e2"} Apr 16 16:25:16.481788 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:16.481743 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vms6s" podStartSLOduration=4.571469393 podStartE2EDuration="5.481729054s" podCreationTimestamp="2026-04-16 16:25:11 +0000 UTC" firstStartedPulling="2026-04-16 16:25:13.474341538 +0000 UTC m=+174.205579936" lastFinishedPulling="2026-04-16 16:25:14.384601204 +0000 UTC m=+175.115839597" observedRunningTime="2026-04-16 16:25:16.481233705 +0000 UTC m=+177.212472122" watchObservedRunningTime="2026-04-16 16:25:16.481729054 +0000 UTC m=+177.212967509" Apr 16 16:25:22.377083 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:22.377053 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:25:25.121351 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:25.121316 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"] Apr 16 16:25:42.915463 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:42.915402 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" podUID="88ef2e06-1249-4fad-a9a8-b7c519a683ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:25:50.140020 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.139961 
2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" podUID="b1dc478b-2290-4269-aeea-056949221c87" containerName="registry" containerID="cri-o://b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de" gracePeriod=30 Apr 16 16:25:50.386106 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.386085 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:25:50.513529 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513503 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1dc478b-2290-4269-aeea-056949221c87-ca-trust-extracted\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513548 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513569 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-image-registry-private-configuration\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513595 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-trusted-ca\") pod 
\"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-installation-pull-secrets\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513931 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513758 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-registry-certificates\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513931 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513792 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-bound-sa-token\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.513931 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.513838 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2rc8\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-kube-api-access-s2rc8\") pod \"b1dc478b-2290-4269-aeea-056949221c87\" (UID: \"b1dc478b-2290-4269-aeea-056949221c87\") " Apr 16 16:25:50.514106 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.514088 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:50.515133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.514397 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:50.516859 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.516833 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:50.516971 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.516858 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:50.516971 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.516878 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-kube-api-access-s2rc8" (OuterVolumeSpecName: "kube-api-access-s2rc8") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "kube-api-access-s2rc8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:50.516971 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.516927 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:50.517100 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.517013 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:50.518838 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.518778 2577 generic.go:358] "Generic (PLEG): container finished" podID="b1dc478b-2290-4269-aeea-056949221c87" containerID="b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de" exitCode=0 Apr 16 16:25:50.518838 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.518812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" event={"ID":"b1dc478b-2290-4269-aeea-056949221c87","Type":"ContainerDied","Data":"b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de"} Apr 16 16:25:50.518979 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.518851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" event={"ID":"b1dc478b-2290-4269-aeea-056949221c87","Type":"ContainerDied","Data":"9690f1732da27043490475ed65cfca4df0ecb230e9aa845ea64beadfa6c81e91"} Apr 16 16:25:50.518979 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.518869 2577 scope.go:117] "RemoveContainer" containerID="b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de" Apr 16 16:25:50.518979 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.518869 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7cbd9b8f5-tqmnr" Apr 16 16:25:50.525239 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.525213 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dc478b-2290-4269-aeea-056949221c87-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b1dc478b-2290-4269-aeea-056949221c87" (UID: "b1dc478b-2290-4269-aeea-056949221c87"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:25:50.530879 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.530859 2577 scope.go:117] "RemoveContainer" containerID="b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de" Apr 16 16:25:50.531133 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:25:50.531111 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de\": container with ID starting with b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de not found: ID does not exist" containerID="b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de" Apr 16 16:25:50.531213 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.531138 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de"} err="failed to get container status \"b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de\": rpc error: code = NotFound desc = could not find container 
\"b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de\": container with ID starting with b7f22221b9b52d554531141632df0e301ae2c138fea8f982b1384e04265077de not found: ID does not exist" Apr 16 16:25:50.614540 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614513 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1dc478b-2290-4269-aeea-056949221c87-ca-trust-extracted\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614540 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614538 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-registry-tls\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614549 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-image-registry-private-configuration\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614558 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-trusted-ca\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614566 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1dc478b-2290-4269-aeea-056949221c87-installation-pull-secrets\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614577 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/b1dc478b-2290-4269-aeea-056949221c87-registry-certificates\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614586 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-bound-sa-token\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.614753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.614595 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2rc8\" (UniqueName: \"kubernetes.io/projected/b1dc478b-2290-4269-aeea-056949221c87-kube-api-access-s2rc8\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:25:50.851685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.851661 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"] Apr 16 16:25:50.855261 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:50.855241 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7cbd9b8f5-tqmnr"] Apr 16 16:25:51.878974 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:51.878936 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1dc478b-2290-4269-aeea-056949221c87" path="/var/lib/kubelet/pods/b1dc478b-2290-4269-aeea-056949221c87/volumes" Apr 16 16:25:52.915975 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:25:52.915939 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" podUID="88ef2e06-1249-4fad-a9a8-b7c519a683ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:26:02.915376 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:02.915339 2577 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" podUID="88ef2e06-1249-4fad-a9a8-b7c519a683ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:26:02.915741 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:02.915413 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" Apr 16 16:26:02.915836 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:02.915819 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"3e6dd8a49ae1d3903c2aa28c2ab9824b3ad1b1ef2d966b46fdb15dd34ed61344"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 16:26:02.915876 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:02.915856 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" podUID="88ef2e06-1249-4fad-a9a8-b7c519a683ce" containerName="service-proxy" containerID="cri-o://3e6dd8a49ae1d3903c2aa28c2ab9824b3ad1b1ef2d966b46fdb15dd34ed61344" gracePeriod=30 Apr 16 16:26:03.552781 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:03.552745 2577 generic.go:358] "Generic (PLEG): container finished" podID="88ef2e06-1249-4fad-a9a8-b7c519a683ce" containerID="3e6dd8a49ae1d3903c2aa28c2ab9824b3ad1b1ef2d966b46fdb15dd34ed61344" exitCode=2 Apr 16 16:26:03.552979 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:03.552812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" event={"ID":"88ef2e06-1249-4fad-a9a8-b7c519a683ce","Type":"ContainerDied","Data":"3e6dd8a49ae1d3903c2aa28c2ab9824b3ad1b1ef2d966b46fdb15dd34ed61344"} Apr 
16 16:26:03.552979 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:03.552847 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6dcdd7968c-jrnbp" event={"ID":"88ef2e06-1249-4fad-a9a8-b7c519a683ce","Type":"ContainerStarted","Data":"46a4f2b0d4fe1869e1ce4979fa12c22c36a09b28a3d1ce86b3fd66eab84c5598"} Apr 16 16:26:31.681665 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:31.681598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:26:31.684067 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:31.684036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648a078-0f71-4a2f-a255-ad1937929932-metrics-certs\") pod \"network-metrics-daemon-2wd9w\" (UID: \"a648a078-0f71-4a2f-a255-ad1937929932\") " pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:26:31.779891 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:31.779857 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pm5r6\"" Apr 16 16:26:31.787903 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:31.787875 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2wd9w" Apr 16 16:26:31.909117 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:31.909083 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2wd9w"] Apr 16 16:26:31.912286 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:26:31.912259 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda648a078_0f71_4a2f_a255_ad1937929932.slice/crio-a73317d2d58e9513b6df654a95155695f97d0ce377453cd2fbe5c254e9d44f36 WatchSource:0}: Error finding container a73317d2d58e9513b6df654a95155695f97d0ce377453cd2fbe5c254e9d44f36: Status 404 returned error can't find the container with id a73317d2d58e9513b6df654a95155695f97d0ce377453cd2fbe5c254e9d44f36 Apr 16 16:26:32.629043 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:32.629005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2wd9w" event={"ID":"a648a078-0f71-4a2f-a255-ad1937929932","Type":"ContainerStarted","Data":"a73317d2d58e9513b6df654a95155695f97d0ce377453cd2fbe5c254e9d44f36"} Apr 16 16:26:33.633843 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:33.633806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2wd9w" event={"ID":"a648a078-0f71-4a2f-a255-ad1937929932","Type":"ContainerStarted","Data":"d0739de705a3f1696dc7dc7fe31f42a628783a227f7b6875ea64918e7f8361ed"} Apr 16 16:26:33.633843 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:26:33.633841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2wd9w" event={"ID":"a648a078-0f71-4a2f-a255-ad1937929932","Type":"ContainerStarted","Data":"f097b5c1cec17d3b16dd7319bc18325fdd97618ad5a473e13c7f6a879ffaed79"} Apr 16 16:27:19.779446 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:27:19.779416 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 
16:30:11.645685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.645618 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2wd9w" podStartSLOduration=471.625587198 podStartE2EDuration="7m52.645600653s" podCreationTimestamp="2026-04-16 16:22:19 +0000 UTC" firstStartedPulling="2026-04-16 16:26:31.914009579 +0000 UTC m=+252.645247973" lastFinishedPulling="2026-04-16 16:26:32.934023031 +0000 UTC m=+253.665261428" observedRunningTime="2026-04-16 16:26:33.65740709 +0000 UTC m=+254.388645506" watchObservedRunningTime="2026-04-16 16:30:11.645600653 +0000 UTC m=+472.376839071" Apr 16 16:30:11.646081 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.646054 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gcllk"] Apr 16 16:30:11.646276 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.646264 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1dc478b-2290-4269-aeea-056949221c87" containerName="registry" Apr 16 16:30:11.646311 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.646278 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dc478b-2290-4269-aeea-056949221c87" containerName="registry" Apr 16 16:30:11.646353 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.646342 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1dc478b-2290-4269-aeea-056949221c87" containerName="registry" Apr 16 16:30:11.648990 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.648976 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.651571 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.651551 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 16:30:11.651803 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.651785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 16:30:11.651864 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.651852 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 16:30:11.652883 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.652865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 16:30:11.652945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.652865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5xrck\"" Apr 16 16:30:11.652945 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.652868 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 16:30:11.660166 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.660146 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gcllk"] Apr 16 16:30:11.745295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.745257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfr42\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-kube-api-access-zfr42\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.745295 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:30:11.745291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.745503 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.745317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-cabundle0\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.846220 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.846183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfr42\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-kube-api-access-zfr42\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.846357 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.846234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.846357 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.846257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-cabundle0\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: 
\"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.846357 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:11.846342 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:30:11.846473 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:11.846360 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:30:11.846473 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:11.846369 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcllk: references non-existent secret key: ca.crt Apr 16 16:30:11.846473 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:11.846416 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates podName:a1f0f98f-3513-43f8-9f40-c645f1fe02b8 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:12.346402162 +0000 UTC m=+473.077640556 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates") pod "keda-operator-ffbb595cb-gcllk" (UID: "a1f0f98f-3513-43f8-9f40-c645f1fe02b8") : references non-existent secret key: ca.crt Apr 16 16:30:11.846922 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.846903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-cabundle0\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:11.865146 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:11.865117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfr42\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-kube-api-access-zfr42\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:12.318336 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.318304 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-4bbg2"] Apr 16 16:30:12.321423 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.321400 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.324734 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.324715 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 16:30:12.332190 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.332161 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4bbg2"] Apr 16 16:30:12.350590 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.350566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:12.350706 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:12.350692 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:30:12.350706 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:12.350705 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:30:12.350793 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:12.350714 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcllk: references non-existent secret key: ca.crt Apr 16 16:30:12.350793 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:12.350766 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates podName:a1f0f98f-3513-43f8-9f40-c645f1fe02b8 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:13.350753283 +0000 UTC m=+474.081991677 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates") pod "keda-operator-ffbb595cb-gcllk" (UID: "a1f0f98f-3513-43f8-9f40-c645f1fe02b8") : references non-existent secret key: ca.crt Apr 16 16:30:12.451714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.451679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptt4\" (UniqueName: \"kubernetes.io/projected/a1aec165-2199-4109-8d59-302b19491d7a-kube-api-access-xptt4\") pod \"keda-admission-cf49989db-4bbg2\" (UID: \"a1aec165-2199-4109-8d59-302b19491d7a\") " pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.451714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.451722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1aec165-2199-4109-8d59-302b19491d7a-certificates\") pod \"keda-admission-cf49989db-4bbg2\" (UID: \"a1aec165-2199-4109-8d59-302b19491d7a\") " pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.552854 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.552811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xptt4\" (UniqueName: \"kubernetes.io/projected/a1aec165-2199-4109-8d59-302b19491d7a-kube-api-access-xptt4\") pod \"keda-admission-cf49989db-4bbg2\" (UID: \"a1aec165-2199-4109-8d59-302b19491d7a\") " pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.552854 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.552860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1aec165-2199-4109-8d59-302b19491d7a-certificates\") pod \"keda-admission-cf49989db-4bbg2\" (UID: \"a1aec165-2199-4109-8d59-302b19491d7a\") " 
pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.555466 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.555443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1aec165-2199-4109-8d59-302b19491d7a-certificates\") pod \"keda-admission-cf49989db-4bbg2\" (UID: \"a1aec165-2199-4109-8d59-302b19491d7a\") " pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.562575 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.562552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptt4\" (UniqueName: \"kubernetes.io/projected/a1aec165-2199-4109-8d59-302b19491d7a-kube-api-access-xptt4\") pod \"keda-admission-cf49989db-4bbg2\" (UID: \"a1aec165-2199-4109-8d59-302b19491d7a\") " pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.631598 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.631545 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:12.750184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.750147 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4bbg2"] Apr 16 16:30:12.753070 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:30:12.753043 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1aec165_2199_4109_8d59_302b19491d7a.slice/crio-fa454f728d2418c04a3a11766020ae26f554854dc308c19ff19ecced81a90661 WatchSource:0}: Error finding container fa454f728d2418c04a3a11766020ae26f554854dc308c19ff19ecced81a90661: Status 404 returned error can't find the container with id fa454f728d2418c04a3a11766020ae26f554854dc308c19ff19ecced81a90661 Apr 16 16:30:12.754232 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:12.754216 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:30:13.165195 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:13.165157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4bbg2" event={"ID":"a1aec165-2199-4109-8d59-302b19491d7a","Type":"ContainerStarted","Data":"fa454f728d2418c04a3a11766020ae26f554854dc308c19ff19ecced81a90661"} Apr 16 16:30:13.358124 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:13.358087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:13.358250 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:13.358219 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:30:13.358250 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:13.358235 2577 
projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:30:13.358250 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:13.358243 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcllk: references non-existent secret key: ca.crt Apr 16 16:30:13.358356 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:13.358297 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates podName:a1f0f98f-3513-43f8-9f40-c645f1fe02b8 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:15.358283621 +0000 UTC m=+476.089522015 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates") pod "keda-operator-ffbb595cb-gcllk" (UID: "a1f0f98f-3513-43f8-9f40-c645f1fe02b8") : references non-existent secret key: ca.crt Apr 16 16:30:15.172302 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:15.172273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4bbg2" event={"ID":"a1aec165-2199-4109-8d59-302b19491d7a","Type":"ContainerStarted","Data":"44d5a061b552564ecc0e54501a4adb1665b16ffa4288a87ca357cbefba5b428d"} Apr 16 16:30:15.172635 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:15.172413 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:15.190281 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:15.190238 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-4bbg2" podStartSLOduration=0.866545687 podStartE2EDuration="3.190224643s" podCreationTimestamp="2026-04-16 16:30:12 +0000 UTC" firstStartedPulling="2026-04-16 16:30:12.754336876 +0000 UTC 
m=+473.485575271" lastFinishedPulling="2026-04-16 16:30:15.078015832 +0000 UTC m=+475.809254227" observedRunningTime="2026-04-16 16:30:15.18949675 +0000 UTC m=+475.920735165" watchObservedRunningTime="2026-04-16 16:30:15.190224643 +0000 UTC m=+475.921463081" Apr 16 16:30:15.376166 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:15.376077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:15.376311 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:15.376191 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:30:15.376311 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:15.376203 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:30:15.376311 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:15.376212 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-gcllk: references non-existent secret key: ca.crt Apr 16 16:30:15.376311 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:30:15.376267 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates podName:a1f0f98f-3513-43f8-9f40-c645f1fe02b8 nodeName:}" failed. No retries permitted until 2026-04-16 16:30:19.376247599 +0000 UTC m=+480.107486004 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates") pod "keda-operator-ffbb595cb-gcllk" (UID: "a1f0f98f-3513-43f8-9f40-c645f1fe02b8") : references non-existent secret key: ca.crt Apr 16 16:30:19.404980 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:19.404940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:19.407573 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:19.407550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1f0f98f-3513-43f8-9f40-c645f1fe02b8-certificates\") pod \"keda-operator-ffbb595cb-gcllk\" (UID: \"a1f0f98f-3513-43f8-9f40-c645f1fe02b8\") " pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:19.458558 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:19.458532 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:19.576038 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:19.576008 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-gcllk"] Apr 16 16:30:19.579971 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:30:19.579934 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f0f98f_3513_43f8_9f40_c645f1fe02b8.slice/crio-580d5c5f5159f92f116f6caefb422d364078680c699c439a8930b2aa56e4717c WatchSource:0}: Error finding container 580d5c5f5159f92f116f6caefb422d364078680c699c439a8930b2aa56e4717c: Status 404 returned error can't find the container with id 580d5c5f5159f92f116f6caefb422d364078680c699c439a8930b2aa56e4717c Apr 16 16:30:20.186151 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:20.186110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" event={"ID":"a1f0f98f-3513-43f8-9f40-c645f1fe02b8","Type":"ContainerStarted","Data":"580d5c5f5159f92f116f6caefb422d364078680c699c439a8930b2aa56e4717c"} Apr 16 16:30:25.205160 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:25.205127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" event={"ID":"a1f0f98f-3513-43f8-9f40-c645f1fe02b8","Type":"ContainerStarted","Data":"dded9eda388b138db9a2270786918c5f45802264f5a5156d903f83606072ec39"} Apr 16 16:30:25.205532 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:25.205282 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:30:25.222076 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:25.222020 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" podStartSLOduration=9.59179065 podStartE2EDuration="14.222004346s" 
podCreationTimestamp="2026-04-16 16:30:11 +0000 UTC" firstStartedPulling="2026-04-16 16:30:19.581267072 +0000 UTC m=+480.312505467" lastFinishedPulling="2026-04-16 16:30:24.211480765 +0000 UTC m=+484.942719163" observedRunningTime="2026-04-16 16:30:25.221195725 +0000 UTC m=+485.952434142" watchObservedRunningTime="2026-04-16 16:30:25.222004346 +0000 UTC m=+485.953242764" Apr 16 16:30:36.177169 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:36.177142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-4bbg2" Apr 16 16:30:46.210599 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:30:46.210569 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-gcllk" Apr 16 16:31:39.359293 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.359214 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-5lvdv"] Apr 16 16:31:39.362117 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.362102 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.364656 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.364622 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vfr2l\"" Apr 16 16:31:39.364739 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.364625 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 16:31:39.365602 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.365578 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 16:31:39.376662 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.373439 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-5lvdv"] Apr 16 16:31:39.444078 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.444056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fq7c\" (UniqueName: \"kubernetes.io/projected/b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040-kube-api-access-6fq7c\") pod \"cert-manager-759f64656b-5lvdv\" (UID: \"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040\") " pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.444168 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.444097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040-bound-sa-token\") pod \"cert-manager-759f64656b-5lvdv\" (UID: \"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040\") " pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.544358 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.544330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fq7c\" (UniqueName: 
\"kubernetes.io/projected/b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040-kube-api-access-6fq7c\") pod \"cert-manager-759f64656b-5lvdv\" (UID: \"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040\") " pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.544477 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.544374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040-bound-sa-token\") pod \"cert-manager-759f64656b-5lvdv\" (UID: \"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040\") " pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.552661 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.552623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040-bound-sa-token\") pod \"cert-manager-759f64656b-5lvdv\" (UID: \"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040\") " pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.552810 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.552794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fq7c\" (UniqueName: \"kubernetes.io/projected/b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040-kube-api-access-6fq7c\") pod \"cert-manager-759f64656b-5lvdv\" (UID: \"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040\") " pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.671059 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.670995 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-5lvdv" Apr 16 16:31:39.788006 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:39.787978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-5lvdv"] Apr 16 16:31:39.791165 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:31:39.791136 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2b6fe3d_b2ea_4d90_96a7_5c1c8330e040.slice/crio-75529b3da736eb3f8826131caeb6840bcaf00923c0dcc2da3bf3a302ab06687c WatchSource:0}: Error finding container 75529b3da736eb3f8826131caeb6840bcaf00923c0dcc2da3bf3a302ab06687c: Status 404 returned error can't find the container with id 75529b3da736eb3f8826131caeb6840bcaf00923c0dcc2da3bf3a302ab06687c Apr 16 16:31:40.399561 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:40.399528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-5lvdv" event={"ID":"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040","Type":"ContainerStarted","Data":"75529b3da736eb3f8826131caeb6840bcaf00923c0dcc2da3bf3a302ab06687c"} Apr 16 16:31:43.409080 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:43.408998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-5lvdv" event={"ID":"b2b6fe3d-b2ea-4d90-96a7-5c1c8330e040","Type":"ContainerStarted","Data":"b89be36f7a22157ab849e8dacdbb9cece5001b6b09145f83013e99770ed7fc8a"} Apr 16 16:31:43.424773 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:43.424713 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-5lvdv" podStartSLOduration=1.264601888 podStartE2EDuration="4.424695721s" podCreationTimestamp="2026-04-16 16:31:39 +0000 UTC" firstStartedPulling="2026-04-16 16:31:39.792982711 +0000 UTC m=+560.524221104" lastFinishedPulling="2026-04-16 16:31:42.95307654 +0000 UTC m=+563.684314937" 
observedRunningTime="2026-04-16 16:31:43.42413079 +0000 UTC m=+564.155369205" watchObservedRunningTime="2026-04-16 16:31:43.424695721 +0000 UTC m=+564.155934138" Apr 16 16:31:55.240478 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.240441 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f"] Apr 16 16:31:55.243518 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.243502 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.246266 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.246239 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-xkk9j\"" Apr 16 16:31:55.246603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.246586 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 16:31:55.246749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.246587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 16:31:55.263490 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.263462 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f"] Apr 16 16:31:55.352622 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.352590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvqp\" (UniqueName: \"kubernetes.io/projected/e37b8392-9a88-41f5-ba91-e189c583fa41-kube-api-access-vtvqp\") pod \"servicemesh-operator3-55f49c5f94-rfb5f\" (UID: \"e37b8392-9a88-41f5-ba91-e189c583fa41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.352782 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.352635 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e37b8392-9a88-41f5-ba91-e189c583fa41-operator-config\") pod \"servicemesh-operator3-55f49c5f94-rfb5f\" (UID: \"e37b8392-9a88-41f5-ba91-e189c583fa41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.453950 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.453920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvqp\" (UniqueName: \"kubernetes.io/projected/e37b8392-9a88-41f5-ba91-e189c583fa41-kube-api-access-vtvqp\") pod \"servicemesh-operator3-55f49c5f94-rfb5f\" (UID: \"e37b8392-9a88-41f5-ba91-e189c583fa41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.454076 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.453965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e37b8392-9a88-41f5-ba91-e189c583fa41-operator-config\") pod \"servicemesh-operator3-55f49c5f94-rfb5f\" (UID: \"e37b8392-9a88-41f5-ba91-e189c583fa41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.456575 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.456552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e37b8392-9a88-41f5-ba91-e189c583fa41-operator-config\") pod \"servicemesh-operator3-55f49c5f94-rfb5f\" (UID: \"e37b8392-9a88-41f5-ba91-e189c583fa41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.462557 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.462531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvqp\" (UniqueName: \"kubernetes.io/projected/e37b8392-9a88-41f5-ba91-e189c583fa41-kube-api-access-vtvqp\") pod 
\"servicemesh-operator3-55f49c5f94-rfb5f\" (UID: \"e37b8392-9a88-41f5-ba91-e189c583fa41\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.552404 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.552347 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:31:55.676716 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:55.676688 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f"] Apr 16 16:31:55.680581 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:31:55.680550 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37b8392_9a88_41f5_ba91_e189c583fa41.slice/crio-f55539fe7d68c9a1922981ad9188b241c6f9986b815521fd2c9bb2c30bfa0416 WatchSource:0}: Error finding container f55539fe7d68c9a1922981ad9188b241c6f9986b815521fd2c9bb2c30bfa0416: Status 404 returned error can't find the container with id f55539fe7d68c9a1922981ad9188b241c6f9986b815521fd2c9bb2c30bfa0416 Apr 16 16:31:56.444072 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:31:56.444042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" event={"ID":"e37b8392-9a88-41f5-ba91-e189c583fa41","Type":"ContainerStarted","Data":"f55539fe7d68c9a1922981ad9188b241c6f9986b815521fd2c9bb2c30bfa0416"} Apr 16 16:32:00.458344 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:00.458308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" event={"ID":"e37b8392-9a88-41f5-ba91-e189c583fa41","Type":"ContainerStarted","Data":"4bf46ddd0a6713c2af16d9aa8a56a08d4ecb5380cea14663db3ce4f1c5d857b2"} Apr 16 16:32:00.458813 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:00.458440 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" Apr 16 16:32:00.479504 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:00.479454 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f" podStartSLOduration=1.309469277 podStartE2EDuration="5.479436307s" podCreationTimestamp="2026-04-16 16:31:55 +0000 UTC" firstStartedPulling="2026-04-16 16:31:55.683118778 +0000 UTC m=+576.414357172" lastFinishedPulling="2026-04-16 16:31:59.853085806 +0000 UTC m=+580.584324202" observedRunningTime="2026-04-16 16:32:00.47770574 +0000 UTC m=+581.208944156" watchObservedRunningTime="2026-04-16 16:32:00.479436307 +0000 UTC m=+581.210674722" Apr 16 16:32:06.775356 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.775318 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"] Apr 16 16:32:06.777504 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.777486 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.780610 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.780587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 16:32:06.781561 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.781540 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 16:32:06.781685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.781630 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-sbds6\""
Apr 16 16:32:06.781685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.781574 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 16:32:06.781685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.781607 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 16:32:06.781836 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.781666 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 16:32:06.781836 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.781587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 16:32:06.792965 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.792934 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"]
Apr 16 16:32:06.835054 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f031416a-c2fb-484c-b795-3498c001ab79-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.835192 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.835192 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.835310 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835226 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.835310 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q67r\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-kube-api-access-5q67r\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.835393 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.835393 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.835341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f031416a-c2fb-484c-b795-3498c001ab79-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936469 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936469 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936469 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936469 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q67r\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-kube-api-access-5q67r\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936722 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.936722 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.936511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.937250 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.937073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.938895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.938872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f031416a-c2fb-484c-b795-3498c001ab79-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.939332 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.939309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.939505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.939482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.939579 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.939539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.948848 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.948827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:06.949081 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:06.949065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q67r\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-kube-api-access-5q67r\") pod \"istiod-openshift-gateway-7cd77c7ffd-f2zw6\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:07.086616 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:07.086534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:07.211198 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:07.211174 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"]
Apr 16 16:32:07.214148 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:32:07.214085 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf031416a_c2fb_484c_b795_3498c001ab79.slice/crio-cd1c7d7e4fc25eb4d6fd4970890cbf5a6a3c5be57d28ca67daf5646ab1b5c62d WatchSource:0}: Error finding container cd1c7d7e4fc25eb4d6fd4970890cbf5a6a3c5be57d28ca67daf5646ab1b5c62d: Status 404 returned error can't find the container with id cd1c7d7e4fc25eb4d6fd4970890cbf5a6a3c5be57d28ca67daf5646ab1b5c62d
Apr 16 16:32:07.478417 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:07.478378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" event={"ID":"f031416a-c2fb-484c-b795-3498c001ab79","Type":"ContainerStarted","Data":"cd1c7d7e4fc25eb4d6fd4970890cbf5a6a3c5be57d28ca67daf5646ab1b5c62d"}
Apr 16 16:32:09.750225 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:09.750186 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 16:32:09.750555 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:09.750253 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 16:32:10.491935 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:10.491893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" event={"ID":"f031416a-c2fb-484c-b795-3498c001ab79","Type":"ContainerStarted","Data":"926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d"}
Apr 16 16:32:10.492236 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:10.492100 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:10.493870 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:10.493831 2577 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-f2zw6 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 16 16:32:10.493979 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:10.493902 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" podUID="f031416a-c2fb-484c-b795-3498c001ab79" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:32:10.513133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:10.513069 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" podStartSLOduration=1.981152674 podStartE2EDuration="4.51305036s" podCreationTimestamp="2026-04-16 16:32:06 +0000 UTC" firstStartedPulling="2026-04-16 16:32:07.218072112 +0000 UTC m=+587.949310510" lastFinishedPulling="2026-04-16 16:32:09.749969798 +0000 UTC m=+590.481208196" observedRunningTime="2026-04-16 16:32:10.512798906 +0000 UTC m=+591.244037322" watchObservedRunningTime="2026-04-16 16:32:10.51305036 +0000 UTC m=+591.244288778"
Apr 16 16:32:11.464028 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:11.464000 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-rfb5f"
Apr 16 16:32:11.495620 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:11.495590 2577 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-f2zw6 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 16 16:32:11.495798 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:11.495638 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" podUID="f031416a-c2fb-484c-b795-3498c001ab79" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:32:14.496786 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:14.496752 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"
Apr 16 16:32:38.246429 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.246342 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"]
Apr 16 16:32:38.249548 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.249526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.253592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.253570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 16:32:38.253718 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.253702 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 16:32:38.253779 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.253719 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-cljgq\""
Apr 16 16:32:38.262968 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.262945 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"]
Apr 16 16:32:38.391853 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.391825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d463226-474e-4925-b6b6-0a6bff73827a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-vbzpd\" (UID: \"2d463226-474e-4925-b6b6-0a6bff73827a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.392018 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.391871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9kfj\" (UniqueName: \"kubernetes.io/projected/2d463226-474e-4925-b6b6-0a6bff73827a-kube-api-access-k9kfj\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-vbzpd\" (UID: \"2d463226-474e-4925-b6b6-0a6bff73827a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.493174 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.493139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d463226-474e-4925-b6b6-0a6bff73827a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-vbzpd\" (UID: \"2d463226-474e-4925-b6b6-0a6bff73827a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.493319 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.493186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9kfj\" (UniqueName: \"kubernetes.io/projected/2d463226-474e-4925-b6b6-0a6bff73827a-kube-api-access-k9kfj\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-vbzpd\" (UID: \"2d463226-474e-4925-b6b6-0a6bff73827a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.493545 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.493527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d463226-474e-4925-b6b6-0a6bff73827a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-vbzpd\" (UID: \"2d463226-474e-4925-b6b6-0a6bff73827a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.502677 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.502586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9kfj\" (UniqueName: \"kubernetes.io/projected/2d463226-474e-4925-b6b6-0a6bff73827a-kube-api-access-k9kfj\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-vbzpd\" (UID: \"2d463226-474e-4925-b6b6-0a6bff73827a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.559581 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.559542 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:38.691510 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:38.691478 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"]
Apr 16 16:32:38.695297 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:32:38.695269 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d463226_474e_4925_b6b6_0a6bff73827a.slice/crio-63d737f0c288610918c657160f93fda3a4928386c3b15e0d779fc2e7195c3599 WatchSource:0}: Error finding container 63d737f0c288610918c657160f93fda3a4928386c3b15e0d779fc2e7195c3599: Status 404 returned error can't find the container with id 63d737f0c288610918c657160f93fda3a4928386c3b15e0d779fc2e7195c3599
Apr 16 16:32:39.577109 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:39.577076 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd" event={"ID":"2d463226-474e-4925-b6b6-0a6bff73827a","Type":"ContainerStarted","Data":"63d737f0c288610918c657160f93fda3a4928386c3b15e0d779fc2e7195c3599"}
Apr 16 16:32:40.155739 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.155701 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"]
Apr 16 16:32:40.158759 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.158731 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:40.165271 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.165246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-xjgbk\""
Apr 16 16:32:40.173048 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.173022 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"]
Apr 16 16:32:40.307469 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.307437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrsq\" (UniqueName: \"kubernetes.io/projected/00a4139e-4585-46ab-835c-2d4eab31d934-kube-api-access-ccrsq\") pod \"limitador-operator-controller-manager-c7fb4c8d5-bkfpn\" (UID: \"00a4139e-4585-46ab-835c-2d4eab31d934\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:40.408802 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.408717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrsq\" (UniqueName: \"kubernetes.io/projected/00a4139e-4585-46ab-835c-2d4eab31d934-kube-api-access-ccrsq\") pod \"limitador-operator-controller-manager-c7fb4c8d5-bkfpn\" (UID: \"00a4139e-4585-46ab-835c-2d4eab31d934\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:40.423742 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.423698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrsq\" (UniqueName: \"kubernetes.io/projected/00a4139e-4585-46ab-835c-2d4eab31d934-kube-api-access-ccrsq\") pod \"limitador-operator-controller-manager-c7fb4c8d5-bkfpn\" (UID: \"00a4139e-4585-46ab-835c-2d4eab31d934\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:40.472688 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:40.472657 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:41.138838 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:41.138766 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"]
Apr 16 16:32:41.140840 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:32:41.140817 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a4139e_4585_46ab_835c_2d4eab31d934.slice/crio-11e6b654d1629c27a0e2406d6df66f0fac69d0cb43fc1387ef7728d269f366ab WatchSource:0}: Error finding container 11e6b654d1629c27a0e2406d6df66f0fac69d0cb43fc1387ef7728d269f366ab: Status 404 returned error can't find the container with id 11e6b654d1629c27a0e2406d6df66f0fac69d0cb43fc1387ef7728d269f366ab
Apr 16 16:32:41.587139 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:41.587100 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn" event={"ID":"00a4139e-4585-46ab-835c-2d4eab31d934","Type":"ContainerStarted","Data":"11e6b654d1629c27a0e2406d6df66f0fac69d0cb43fc1387ef7728d269f366ab"}
Apr 16 16:32:45.604103 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:45.604063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd" event={"ID":"2d463226-474e-4925-b6b6-0a6bff73827a","Type":"ContainerStarted","Data":"09a7aac476ed808d836e0ea488cdc8628d668895f5af6ac7b5098824521498a7"}
Apr 16 16:32:45.604559 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:45.604226 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:32:45.605493 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:45.605474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn" event={"ID":"00a4139e-4585-46ab-835c-2d4eab31d934","Type":"ContainerStarted","Data":"9300baaa438c505b41d7140f9ee84da05a5f736c028f3a60cb919741b35ddf19"}
Apr 16 16:32:45.605601 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:45.605589 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:45.624111 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:45.624066 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd" podStartSLOduration=1.679533138 podStartE2EDuration="7.624055836s" podCreationTimestamp="2026-04-16 16:32:38 +0000 UTC" firstStartedPulling="2026-04-16 16:32:38.697638097 +0000 UTC m=+619.428876493" lastFinishedPulling="2026-04-16 16:32:44.642160795 +0000 UTC m=+625.373399191" observedRunningTime="2026-04-16 16:32:45.622965151 +0000 UTC m=+626.354203566" watchObservedRunningTime="2026-04-16 16:32:45.624055836 +0000 UTC m=+626.355294251"
Apr 16 16:32:45.640470 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:45.640432 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn" podStartSLOduration=2.132960177 podStartE2EDuration="5.64042022s" podCreationTimestamp="2026-04-16 16:32:40 +0000 UTC" firstStartedPulling="2026-04-16 16:32:41.143208057 +0000 UTC m=+621.874446463" lastFinishedPulling="2026-04-16 16:32:44.650668102 +0000 UTC m=+625.381906506" observedRunningTime="2026-04-16 16:32:45.638709326 +0000 UTC m=+626.369947740" watchObservedRunningTime="2026-04-16 16:32:45.64042022 +0000 UTC m=+626.371658635"
Apr 16 16:32:56.611029 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:56.610995 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-bkfpn"
Apr 16 16:32:56.611427 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:32:56.611145 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-vbzpd"
Apr 16 16:33:06.934690 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.934656 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"]
Apr 16 16:33:06.936864 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.936847 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:06.939214 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.939193 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 16:33:06.939378 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.939358 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-szw9h\""
Apr 16 16:33:06.947712 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.947684 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"]
Apr 16 16:33:06.994896 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.994857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffc6\" (UniqueName: \"kubernetes.io/projected/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-kube-api-access-pffc6\") pod \"limitador-limitador-64c8f475fb-jkpnq\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:06.995055 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:06.994918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-config-file\") pod \"limitador-limitador-64c8f475fb-jkpnq\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:07.043341 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.043310 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"]
Apr 16 16:33:07.095691 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.095624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-config-file\") pod \"limitador-limitador-64c8f475fb-jkpnq\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:07.095878 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.095713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pffc6\" (UniqueName: \"kubernetes.io/projected/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-kube-api-access-pffc6\") pod \"limitador-limitador-64c8f475fb-jkpnq\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:07.096302 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.096281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-config-file\") pod \"limitador-limitador-64c8f475fb-jkpnq\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:07.105274 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.105246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffc6\" (UniqueName: \"kubernetes.io/projected/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-kube-api-access-pffc6\") pod \"limitador-limitador-64c8f475fb-jkpnq\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:07.247854 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.247774 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:07.379315 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.379262 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"]
Apr 16 16:33:07.381737 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:33:07.381709 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1a3ee0_85fa_43cc_af08_4bd1c0e2ae65.slice/crio-365eca32f5c872cfd8f70384b22878e8348463e847f9b49d6d604b7e58d3e183 WatchSource:0}: Error finding container 365eca32f5c872cfd8f70384b22878e8348463e847f9b49d6d604b7e58d3e183: Status 404 returned error can't find the container with id 365eca32f5c872cfd8f70384b22878e8348463e847f9b49d6d604b7e58d3e183
Apr 16 16:33:07.677666 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:07.677617 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq" event={"ID":"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65","Type":"ContainerStarted","Data":"365eca32f5c872cfd8f70384b22878e8348463e847f9b49d6d604b7e58d3e183"}
Apr 16 16:33:11.691123 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:11.691081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq" event={"ID":"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65","Type":"ContainerStarted","Data":"3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60"}
Apr 16 16:33:11.691523 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:11.691145 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:11.714049 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:11.713990 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq" podStartSLOduration=1.602127294 podStartE2EDuration="5.713974033s" podCreationTimestamp="2026-04-16 16:33:06 +0000 UTC" firstStartedPulling="2026-04-16 16:33:07.38407486 +0000 UTC m=+648.115313258" lastFinishedPulling="2026-04-16 16:33:11.495921603 +0000 UTC m=+652.227159997" observedRunningTime="2026-04-16 16:33:11.711894697 +0000 UTC m=+652.443133113" watchObservedRunningTime="2026-04-16 16:33:11.713974033 +0000 UTC m=+652.445212490"
Apr 16 16:33:22.695744 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:22.695716 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:23.642699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:23.642663 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"]
Apr 16 16:33:23.642968 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:23.642922 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq" podUID="5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" containerName="limitador" containerID="cri-o://3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60" gracePeriod=30
Apr 16 16:33:24.192824 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.192802 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:24.211611 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.211552 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-config-file\") pod \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") "
Apr 16 16:33:24.211754 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.211620 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pffc6\" (UniqueName: \"kubernetes.io/projected/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-kube-api-access-pffc6\") pod \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\" (UID: \"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65\") "
Apr 16 16:33:24.211895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.211874 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-config-file" (OuterVolumeSpecName: "config-file") pod "5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" (UID: "5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:33:24.213766 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.213747 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-kube-api-access-pffc6" (OuterVolumeSpecName: "kube-api-access-pffc6") pod "5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" (UID: "5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65"). InnerVolumeSpecName "kube-api-access-pffc6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:33:24.312694 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.312668 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pffc6\" (UniqueName: \"kubernetes.io/projected/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-kube-api-access-pffc6\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:33:24.312694 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.312691 2577 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65-config-file\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:33:24.730617 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.730585 2577 generic.go:358] "Generic (PLEG): container finished" podID="5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" containerID="3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60" exitCode=0
Apr 16 16:33:24.730776 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.730665 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"
Apr 16 16:33:24.730776 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.730690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq" event={"ID":"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65","Type":"ContainerDied","Data":"3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60"}
Apr 16 16:33:24.730776 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.730732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-jkpnq" event={"ID":"5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65","Type":"ContainerDied","Data":"365eca32f5c872cfd8f70384b22878e8348463e847f9b49d6d604b7e58d3e183"}
Apr 16 16:33:24.730776 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.730751 2577 scope.go:117] "RemoveContainer" containerID="3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60"
Apr 16 16:33:24.739487 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.739469 2577 scope.go:117] "RemoveContainer" containerID="3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60"
Apr 16 16:33:24.739752 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:33:24.739735 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60\": container with ID starting with 3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60 not found: ID does not exist" containerID="3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60"
Apr 16 16:33:24.739804 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.739761 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60"} err="failed to get container status
\"3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60\": rpc error: code = NotFound desc = could not find container \"3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60\": container with ID starting with 3e5308e65eb0be909a1faa85f196b6f948aaecf967c24a9152710c7ff8798f60 not found: ID does not exist" Apr 16 16:33:24.755151 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.755127 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"] Apr 16 16:33:24.760933 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:24.760911 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-jkpnq"] Apr 16 16:33:25.879265 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:33:25.879232 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" path="/var/lib/kubelet/pods/5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65/volumes" Apr 16 16:34:35.761580 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.761544 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-s6qwr"] Apr 16 16:34:35.762113 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.762020 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" containerName="limitador" Apr 16 16:34:35.762113 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.762041 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" containerName="limitador" Apr 16 16:34:35.762113 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.762112 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d1a3ee0-85fa-43cc-af08-4bd1c0e2ae65" containerName="limitador" Apr 16 16:34:35.764912 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.764890 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:35.768139 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.768119 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-f89hw\"" Apr 16 16:34:35.768259 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.768166 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 16:34:35.772382 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.772352 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-s6qwr"] Apr 16 16:34:35.937917 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.937882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/41397218-8c5a-4010-a70e-f99a86f53581-tls-cert\") pod \"authorino-68bd676465-s6qwr\" (UID: \"41397218-8c5a-4010-a70e-f99a86f53581\") " pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:35.938083 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:35.937925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgcq\" (UniqueName: \"kubernetes.io/projected/41397218-8c5a-4010-a70e-f99a86f53581-kube-api-access-nlgcq\") pod \"authorino-68bd676465-s6qwr\" (UID: \"41397218-8c5a-4010-a70e-f99a86f53581\") " pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:36.038899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.038814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgcq\" (UniqueName: \"kubernetes.io/projected/41397218-8c5a-4010-a70e-f99a86f53581-kube-api-access-nlgcq\") pod \"authorino-68bd676465-s6qwr\" (UID: \"41397218-8c5a-4010-a70e-f99a86f53581\") " pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:36.039038 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.038913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/41397218-8c5a-4010-a70e-f99a86f53581-tls-cert\") pod \"authorino-68bd676465-s6qwr\" (UID: \"41397218-8c5a-4010-a70e-f99a86f53581\") " pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:36.041346 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.041321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/41397218-8c5a-4010-a70e-f99a86f53581-tls-cert\") pod \"authorino-68bd676465-s6qwr\" (UID: \"41397218-8c5a-4010-a70e-f99a86f53581\") " pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:36.047378 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.047351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgcq\" (UniqueName: \"kubernetes.io/projected/41397218-8c5a-4010-a70e-f99a86f53581-kube-api-access-nlgcq\") pod \"authorino-68bd676465-s6qwr\" (UID: \"41397218-8c5a-4010-a70e-f99a86f53581\") " pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:36.074132 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.074103 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-s6qwr" Apr 16 16:34:36.195952 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.195866 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-s6qwr"] Apr 16 16:34:36.198351 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:34:36.198325 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41397218_8c5a_4010_a70e_f99a86f53581.slice/crio-a8966437400278f725a7732670c6e89cc5c2b80cb0a6f73dd199d027846db014 WatchSource:0}: Error finding container a8966437400278f725a7732670c6e89cc5c2b80cb0a6f73dd199d027846db014: Status 404 returned error can't find the container with id a8966437400278f725a7732670c6e89cc5c2b80cb0a6f73dd199d027846db014 Apr 16 16:34:36.959508 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:36.959478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-s6qwr" event={"ID":"41397218-8c5a-4010-a70e-f99a86f53581","Type":"ContainerStarted","Data":"a8966437400278f725a7732670c6e89cc5c2b80cb0a6f73dd199d027846db014"} Apr 16 16:34:38.970362 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:38.970316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-s6qwr" event={"ID":"41397218-8c5a-4010-a70e-f99a86f53581","Type":"ContainerStarted","Data":"78c67f59e70ca613213ba2cba600b3051b0f1792995f984f4a542dc6274e199c"} Apr 16 16:34:38.986168 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:38.986119 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-s6qwr" podStartSLOduration=2.225451272 podStartE2EDuration="3.986105617s" podCreationTimestamp="2026-04-16 16:34:35 +0000 UTC" firstStartedPulling="2026-04-16 16:34:36.199523861 +0000 UTC m=+736.930762255" lastFinishedPulling="2026-04-16 16:34:37.960178206 +0000 UTC m=+738.691416600" 
observedRunningTime="2026-04-16 16:34:38.984960584 +0000 UTC m=+739.716199000" watchObservedRunningTime="2026-04-16 16:34:38.986105617 +0000 UTC m=+739.717344032" Apr 16 16:34:46.705751 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.705715 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s"] Apr 16 16:34:46.708479 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.708455 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.726383 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbqcq\" (UniqueName: \"kubernetes.io/projected/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-kube-api-access-bbqcq\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.726460 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.726460 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 
16:34:46.726540 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.726540 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.726600 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.726632 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.726604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.751697 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.751668 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s"] Apr 16 16:34:46.826963 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.826928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.827099 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.826977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbqcq\" (UniqueName: \"kubernetes.io/projected/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-kube-api-access-bbqcq\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.827099 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.826999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.827099 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.827022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.827099 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.827044 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.827099 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.827074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.827099 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.827097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.828149 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.828119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.829708 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.829684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.829862 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.829834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.829966 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.829866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.830023 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.829991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.841001 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.840976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:46.842347 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:46.842319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbqcq\" (UniqueName: 
\"kubernetes.io/projected/b75fc954-79ff-4ddb-9c6f-23ec26c7fb31-kube-api-access-bbqcq\") pod \"istiod-openshift-gateway-55ff986f96-cq89s\" (UID: \"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:47.017616 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:47.017536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:47.159545 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:47.159481 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s"] Apr 16 16:34:47.166768 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:47.166725 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:34:47.166871 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:47.166814 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:34:47.998512 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:47.998478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" event={"ID":"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31","Type":"ContainerStarted","Data":"a8e1c1f8267a8b6d01f5180995566633648b3677babd8343ce0f05440bf74db5"} Apr 16 16:34:47.998512 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:47.998511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" event={"ID":"b75fc954-79ff-4ddb-9c6f-23ec26c7fb31","Type":"ContainerStarted","Data":"4ff1a2a796868f4716e01b2bb5acab393dbcd392d0302ec18e93c8a48902094c"} Apr 16 16:34:47.999032 ip-10-0-132-246 kubenswrapper[2577]: I0416 
16:34:47.998590 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:49.004129 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.004096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" Apr 16 16:34:49.034946 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.034901 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-cq89s" podStartSLOduration=3.034886349 podStartE2EDuration="3.034886349s" podCreationTimestamp="2026-04-16 16:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:34:48.038894548 +0000 UTC m=+748.770132964" watchObservedRunningTime="2026-04-16 16:34:49.034886349 +0000 UTC m=+749.766124765" Apr 16 16:34:49.087924 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.087892 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"] Apr 16 16:34:49.088238 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.088191 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" podUID="f031416a-c2fb-484c-b795-3498c001ab79" containerName="discovery" containerID="cri-o://926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d" gracePeriod=30 Apr 16 16:34:49.336007 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.335984 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" Apr 16 16:34:49.346854 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346831 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-cacerts\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.346970 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346864 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q67r\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-kube-api-access-5q67r\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.346970 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346891 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-ca-configmap\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.346970 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346919 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f031416a-c2fb-484c-b795-3498c001ab79-local-certs\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.346970 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346936 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-dns-cert\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.346970 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346954 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-kubeconfig\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.347201 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.346986 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-istio-token\") pod \"f031416a-c2fb-484c-b795-3498c001ab79\" (UID: \"f031416a-c2fb-484c-b795-3498c001ab79\") " Apr 16 16:34:49.347446 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.347402 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:34:49.349749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.349719 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:34:49.349749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.349733 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-istio-token" (OuterVolumeSpecName: "istio-token") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:34:49.349749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.349743 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-cacerts" (OuterVolumeSpecName: "cacerts") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:34:49.349913 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.349761 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f031416a-c2fb-484c-b795-3498c001ab79-local-certs" (OuterVolumeSpecName: "local-certs") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:34:49.349913 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.349886 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-kube-api-access-5q67r" (OuterVolumeSpecName: "kube-api-access-5q67r") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "kube-api-access-5q67r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:34:49.350517 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.350502 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "f031416a-c2fb-484c-b795-3498c001ab79" (UID: "f031416a-c2fb-484c-b795-3498c001ab79"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:34:49.447675 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447625 2577 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-istio-token\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:49.447675 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447669 2577 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-cacerts\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:49.447675 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447680 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5q67r\" (UniqueName: \"kubernetes.io/projected/f031416a-c2fb-484c-b795-3498c001ab79-kube-api-access-5q67r\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:49.447885 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447690 2577 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-ca-configmap\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:49.447885 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447700 2577 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/f031416a-c2fb-484c-b795-3498c001ab79-local-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:49.447885 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447708 2577 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-csr-dns-cert\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:49.447885 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:49.447717 2577 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f031416a-c2fb-484c-b795-3498c001ab79-istio-kubeconfig\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:34:50.006417 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.006381 2577 generic.go:358] "Generic (PLEG): container finished" podID="f031416a-c2fb-484c-b795-3498c001ab79" containerID="926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d" exitCode=0 Apr 16 16:34:50.006917 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.006479 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" Apr 16 16:34:50.006917 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.006457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" event={"ID":"f031416a-c2fb-484c-b795-3498c001ab79","Type":"ContainerDied","Data":"926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d"} Apr 16 16:34:50.006917 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.006570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6" event={"ID":"f031416a-c2fb-484c-b795-3498c001ab79","Type":"ContainerDied","Data":"cd1c7d7e4fc25eb4d6fd4970890cbf5a6a3c5be57d28ca67daf5646ab1b5c62d"} Apr 16 16:34:50.006917 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.006586 2577 scope.go:117] "RemoveContainer" containerID="926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d" Apr 16 16:34:50.019296 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.019279 2577 scope.go:117] "RemoveContainer" containerID="926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d" Apr 16 16:34:50.019592 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:34:50.019568 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d\": container with ID starting with 926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d not found: ID does not exist" containerID="926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d" Apr 16 16:34:50.019659 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.019606 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d"} err="failed to get container status 
\"926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d\": rpc error: code = NotFound desc = could not find container \"926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d\": container with ID starting with 926b3fff020654c5a532372a73106bcc81f85580b16de7b3f715911df9b87f1d not found: ID does not exist" Apr 16 16:34:50.077031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.076999 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"] Apr 16 16:34:50.090245 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:50.090171 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-f2zw6"] Apr 16 16:34:51.879571 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:51.879537 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f031416a-c2fb-484c-b795-3498c001ab79" path="/var/lib/kubelet/pods/f031416a-c2fb-484c-b795-3498c001ab79/volumes" Apr 16 16:34:58.207580 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.207551 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-8rd9c"] Apr 16 16:34:58.207956 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.207865 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f031416a-c2fb-484c-b795-3498c001ab79" containerName="discovery" Apr 16 16:34:58.207956 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.207877 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f031416a-c2fb-484c-b795-3498c001ab79" containerName="discovery" Apr 16 16:34:58.207956 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.207934 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f031416a-c2fb-484c-b795-3498c001ab79" containerName="discovery" Apr 16 16:34:58.210636 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.210620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.216118 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.216096 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 16:34:58.217236 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.217207 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:34:58.217236 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.217229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:34:58.217397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.217287 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-g7mgl\"" Apr 16 16:34:58.224686 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.224664 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-8rd9c"] Apr 16 16:34:58.265983 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.265955 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-pkj5c"] Apr 16 16:34:58.268017 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.267994 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.273981 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.273959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:34:58.274114 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.274006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-b7dsj\"" Apr 16 16:34:58.278749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.278721 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-pkj5c"] Apr 16 16:34:58.314148 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.314117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbc56\" (UniqueName: \"kubernetes.io/projected/da0a4793-5bf1-47fa-81c5-09bb9b89e2d9-kube-api-access-lbc56\") pod \"seaweedfs-86cc847c5c-pkj5c\" (UID: \"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9\") " pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.314295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.314170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-cert\") pod \"kserve-controller-manager-55c74f6fbc-8rd9c\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.314295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.314208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5xhd\" (UniqueName: \"kubernetes.io/projected/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-kube-api-access-k5xhd\") pod \"kserve-controller-manager-55c74f6fbc-8rd9c\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.314295 
ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.314261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/da0a4793-5bf1-47fa-81c5-09bb9b89e2d9-data\") pod \"seaweedfs-86cc847c5c-pkj5c\" (UID: \"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9\") " pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.415576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.415542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-cert\") pod \"kserve-controller-manager-55c74f6fbc-8rd9c\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.415576 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.415577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5xhd\" (UniqueName: \"kubernetes.io/projected/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-kube-api-access-k5xhd\") pod \"kserve-controller-manager-55c74f6fbc-8rd9c\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.415839 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.415600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/da0a4793-5bf1-47fa-81c5-09bb9b89e2d9-data\") pod \"seaweedfs-86cc847c5c-pkj5c\" (UID: \"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9\") " pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.415839 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.415680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbc56\" (UniqueName: \"kubernetes.io/projected/da0a4793-5bf1-47fa-81c5-09bb9b89e2d9-kube-api-access-lbc56\") pod \"seaweedfs-86cc847c5c-pkj5c\" (UID: \"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9\") " 
pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.416029 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.416008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/da0a4793-5bf1-47fa-81c5-09bb9b89e2d9-data\") pod \"seaweedfs-86cc847c5c-pkj5c\" (UID: \"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9\") " pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.418180 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.418158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-cert\") pod \"kserve-controller-manager-55c74f6fbc-8rd9c\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.429513 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.429480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbc56\" (UniqueName: \"kubernetes.io/projected/da0a4793-5bf1-47fa-81c5-09bb9b89e2d9-kube-api-access-lbc56\") pod \"seaweedfs-86cc847c5c-pkj5c\" (UID: \"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9\") " pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.430082 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.430062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5xhd\" (UniqueName: \"kubernetes.io/projected/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-kube-api-access-k5xhd\") pod \"kserve-controller-manager-55c74f6fbc-8rd9c\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.520772 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.520700 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:34:58.577899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.577868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:34:58.650010 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.649986 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-8rd9c"] Apr 16 16:34:58.652541 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:34:58.652516 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365c439f_9cc4_4cfc_8dd3_1a5bbe6a71c3.slice/crio-a70672083c3436454aee73c6d743502bce63fa94b0704d409eb4a7b3582742b9 WatchSource:0}: Error finding container a70672083c3436454aee73c6d743502bce63fa94b0704d409eb4a7b3582742b9: Status 404 returned error can't find the container with id a70672083c3436454aee73c6d743502bce63fa94b0704d409eb4a7b3582742b9 Apr 16 16:34:58.710289 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:58.710269 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-pkj5c"] Apr 16 16:34:58.711982 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:34:58.711954 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0a4793_5bf1_47fa_81c5_09bb9b89e2d9.slice/crio-638c7ac0eba8817200a41a66e456af40a7bbc0bbb0d98389546c55dfff2f7990 WatchSource:0}: Error finding container 638c7ac0eba8817200a41a66e456af40a7bbc0bbb0d98389546c55dfff2f7990: Status 404 returned error can't find the container with id 638c7ac0eba8817200a41a66e456af40a7bbc0bbb0d98389546c55dfff2f7990 Apr 16 16:34:59.035133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:59.035096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-pkj5c" 
event={"ID":"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9","Type":"ContainerStarted","Data":"638c7ac0eba8817200a41a66e456af40a7bbc0bbb0d98389546c55dfff2f7990"} Apr 16 16:34:59.036153 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:34:59.036128 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" event={"ID":"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3","Type":"ContainerStarted","Data":"a70672083c3436454aee73c6d743502bce63fa94b0704d409eb4a7b3582742b9"} Apr 16 16:35:03.057381 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:03.057344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" event={"ID":"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3","Type":"ContainerStarted","Data":"76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce"} Apr 16 16:35:03.057857 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:03.057389 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:35:03.058607 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:03.058584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-pkj5c" event={"ID":"da0a4793-5bf1-47fa-81c5-09bb9b89e2d9","Type":"ContainerStarted","Data":"824d42faf5f8caa96c4b60de884725df16692ec18ee11b19bc8ca2e40e8558c5"} Apr 16 16:35:03.058785 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:03.058772 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:35:03.077152 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:03.077112 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" podStartSLOduration=1.396768497 podStartE2EDuration="5.077101772s" podCreationTimestamp="2026-04-16 16:34:58 +0000 UTC" firstStartedPulling="2026-04-16 16:34:58.653792115 +0000 UTC m=+759.385030513" 
lastFinishedPulling="2026-04-16 16:35:02.334125394 +0000 UTC m=+763.065363788" observedRunningTime="2026-04-16 16:35:03.076059259 +0000 UTC m=+763.807297690" watchObservedRunningTime="2026-04-16 16:35:03.077101772 +0000 UTC m=+763.808340236" Apr 16 16:35:03.097532 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:03.097490 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-pkj5c" podStartSLOduration=1.380557085 podStartE2EDuration="5.097479102s" podCreationTimestamp="2026-04-16 16:34:58 +0000 UTC" firstStartedPulling="2026-04-16 16:34:58.71336251 +0000 UTC m=+759.444600904" lastFinishedPulling="2026-04-16 16:35:02.430284527 +0000 UTC m=+763.161522921" observedRunningTime="2026-04-16 16:35:03.09573632 +0000 UTC m=+763.826974737" watchObservedRunningTime="2026-04-16 16:35:03.097479102 +0000 UTC m=+763.828717565" Apr 16 16:35:09.064634 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:09.064599 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-pkj5c" Apr 16 16:35:34.066933 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:34.066863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:35:35.802895 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.802861 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-8rd9c"] Apr 16 16:35:35.803280 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.803087 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" podUID="365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" containerName="manager" containerID="cri-o://76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce" gracePeriod=10 Apr 16 16:35:35.829503 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.829475 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/kserve-controller-manager-55c74f6fbc-xzj2m"] Apr 16 16:35:35.832012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.831998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:35.840871 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.840850 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-xzj2m"] Apr 16 16:35:35.905333 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.905305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5bt\" (UniqueName: \"kubernetes.io/projected/77bd7dcc-9785-4f85-9332-86685486c2b2-kube-api-access-rk5bt\") pod \"kserve-controller-manager-55c74f6fbc-xzj2m\" (UID: \"77bd7dcc-9785-4f85-9332-86685486c2b2\") " pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:35.905443 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:35.905341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77bd7dcc-9785-4f85-9332-86685486c2b2-cert\") pod \"kserve-controller-manager-55c74f6fbc-xzj2m\" (UID: \"77bd7dcc-9785-4f85-9332-86685486c2b2\") " pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:36.006670 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.006615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5bt\" (UniqueName: \"kubernetes.io/projected/77bd7dcc-9785-4f85-9332-86685486c2b2-kube-api-access-rk5bt\") pod \"kserve-controller-manager-55c74f6fbc-xzj2m\" (UID: \"77bd7dcc-9785-4f85-9332-86685486c2b2\") " pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:36.006989 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.006872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/77bd7dcc-9785-4f85-9332-86685486c2b2-cert\") pod \"kserve-controller-manager-55c74f6fbc-xzj2m\" (UID: \"77bd7dcc-9785-4f85-9332-86685486c2b2\") " pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:36.009317 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.009291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77bd7dcc-9785-4f85-9332-86685486c2b2-cert\") pod \"kserve-controller-manager-55c74f6fbc-xzj2m\" (UID: \"77bd7dcc-9785-4f85-9332-86685486c2b2\") " pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:36.015615 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.015595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5bt\" (UniqueName: \"kubernetes.io/projected/77bd7dcc-9785-4f85-9332-86685486c2b2-kube-api-access-rk5bt\") pod \"kserve-controller-manager-55c74f6fbc-xzj2m\" (UID: \"77bd7dcc-9785-4f85-9332-86685486c2b2\") " pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:36.043389 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.043368 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:35:36.160685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.160574 2577 generic.go:358] "Generic (PLEG): container finished" podID="365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" containerID="76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce" exitCode=0 Apr 16 16:35:36.160685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.160661 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" Apr 16 16:35:36.160685 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.160665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" event={"ID":"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3","Type":"ContainerDied","Data":"76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce"} Apr 16 16:35:36.160931 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.160703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-8rd9c" event={"ID":"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3","Type":"ContainerDied","Data":"a70672083c3436454aee73c6d743502bce63fa94b0704d409eb4a7b3582742b9"} Apr 16 16:35:36.160931 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.160722 2577 scope.go:117] "RemoveContainer" containerID="76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce" Apr 16 16:35:36.170408 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.170383 2577 scope.go:117] "RemoveContainer" containerID="76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce" Apr 16 16:35:36.171008 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:35:36.170985 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce\": container with ID starting with 76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce not found: ID does not exist" containerID="76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce" Apr 16 16:35:36.171111 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.171049 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce"} err="failed to get container status 
\"76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce\": rpc error: code = NotFound desc = could not find container \"76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce\": container with ID starting with 76e5be9a92d686ce57bf4612dac66c7d75213ae4f0700b1b2bcd78d0cc9077ce not found: ID does not exist" Apr 16 16:35:36.181908 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.181888 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:36.208765 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.208737 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-cert\") pod \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " Apr 16 16:35:36.208906 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.208811 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5xhd\" (UniqueName: \"kubernetes.io/projected/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-kube-api-access-k5xhd\") pod \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\" (UID: \"365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3\") " Apr 16 16:35:36.211045 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.211019 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-cert" (OuterVolumeSpecName: "cert") pod "365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" (UID: "365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:35:36.211149 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.211099 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-kube-api-access-k5xhd" (OuterVolumeSpecName: "kube-api-access-k5xhd") pod "365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" (UID: "365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3"). InnerVolumeSpecName "kube-api-access-k5xhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:35:36.304434 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.304286 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-xzj2m"] Apr 16 16:35:36.307152 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:35:36.307122 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77bd7dcc_9785_4f85_9332_86685486c2b2.slice/crio-bcd430c31345fb820fb5976c339d4374b3d6b3db554f30768c8745fff9375e72 WatchSource:0}: Error finding container bcd430c31345fb820fb5976c339d4374b3d6b3db554f30768c8745fff9375e72: Status 404 returned error can't find the container with id bcd430c31345fb820fb5976c339d4374b3d6b3db554f30768c8745fff9375e72 Apr 16 16:35:36.308333 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.308317 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:35:36.309456 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.309437 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5xhd\" (UniqueName: \"kubernetes.io/projected/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-kube-api-access-k5xhd\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:35:36.309524 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.309459 2577 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3-cert\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:35:36.484214 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.484190 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-8rd9c"] Apr 16 16:35:36.489060 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:36.489032 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-55c74f6fbc-8rd9c"] Apr 16 16:35:37.165752 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:37.165667 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" event={"ID":"77bd7dcc-9785-4f85-9332-86685486c2b2","Type":"ContainerStarted","Data":"cec3c49e6fe07104d8b04c3054cd728f19b54371996b4f8f1a5809e691799c50"} Apr 16 16:35:37.165752 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:37.165706 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" event={"ID":"77bd7dcc-9785-4f85-9332-86685486c2b2","Type":"ContainerStarted","Data":"bcd430c31345fb820fb5976c339d4374b3d6b3db554f30768c8745fff9375e72"} Apr 16 16:35:37.166153 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:37.165806 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:35:37.186446 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:37.186395 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" podStartSLOduration=1.840980079 podStartE2EDuration="2.186381854s" podCreationTimestamp="2026-04-16 16:35:35 +0000 UTC" firstStartedPulling="2026-04-16 16:35:36.308479042 +0000 UTC m=+797.039717436" lastFinishedPulling="2026-04-16 16:35:36.653880814 +0000 UTC m=+797.385119211" observedRunningTime="2026-04-16 16:35:37.185399253 +0000 UTC m=+797.916637686" 
watchObservedRunningTime="2026-04-16 16:35:37.186381854 +0000 UTC m=+797.917620299" Apr 16 16:35:37.880289 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:35:37.880247 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" path="/var/lib/kubelet/pods/365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3/volumes" Apr 16 16:36:08.173105 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:08.173074 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-55c74f6fbc-xzj2m" Apr 16 16:36:09.426750 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.426624 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-ksqsj"] Apr 16 16:36:09.427151 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.427134 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" containerName="manager" Apr 16 16:36:09.427193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.427154 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" containerName="manager" Apr 16 16:36:09.427251 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.427239 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="365c439f-9cc4-4cfc-8dd3-1a5bbe6a71c3" containerName="manager" Apr 16 16:36:09.430505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.430485 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.433575 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.433558 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-4r6bb\""
Apr 16 16:36:09.434517 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.434493 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 16:36:09.465390 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.465363 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-ksqsj"]
Apr 16 16:36:09.549341 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.549310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7q9\" (UniqueName: \"kubernetes.io/projected/dff582a4-3ff7-4498-aa36-170e46df357d-kube-api-access-rd7q9\") pod \"model-serving-api-86f7b4b499-ksqsj\" (UID: \"dff582a4-3ff7-4498-aa36-170e46df357d\") " pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.549341 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.549344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff582a4-3ff7-4498-aa36-170e46df357d-tls-certs\") pod \"model-serving-api-86f7b4b499-ksqsj\" (UID: \"dff582a4-3ff7-4498-aa36-170e46df357d\") " pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.649811 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.649780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7q9\" (UniqueName: \"kubernetes.io/projected/dff582a4-3ff7-4498-aa36-170e46df357d-kube-api-access-rd7q9\") pod \"model-serving-api-86f7b4b499-ksqsj\" (UID: \"dff582a4-3ff7-4498-aa36-170e46df357d\") " pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.649811 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.649815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff582a4-3ff7-4498-aa36-170e46df357d-tls-certs\") pod \"model-serving-api-86f7b4b499-ksqsj\" (UID: \"dff582a4-3ff7-4498-aa36-170e46df357d\") " pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.652373 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.652349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff582a4-3ff7-4498-aa36-170e46df357d-tls-certs\") pod \"model-serving-api-86f7b4b499-ksqsj\" (UID: \"dff582a4-3ff7-4498-aa36-170e46df357d\") " pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.664186 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.664160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7q9\" (UniqueName: \"kubernetes.io/projected/dff582a4-3ff7-4498-aa36-170e46df357d-kube-api-access-rd7q9\") pod \"model-serving-api-86f7b4b499-ksqsj\" (UID: \"dff582a4-3ff7-4498-aa36-170e46df357d\") " pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.741012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.740939 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:09.886466 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:09.886433 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-ksqsj"]
Apr 16 16:36:09.889231 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:36:09.889201 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff582a4_3ff7_4498_aa36_170e46df357d.slice/crio-fece736993a2a8520c88ca1084d347fc2760b34a272c22e1ea8f43b0fbd06df8 WatchSource:0}: Error finding container fece736993a2a8520c88ca1084d347fc2760b34a272c22e1ea8f43b0fbd06df8: Status 404 returned error can't find the container with id fece736993a2a8520c88ca1084d347fc2760b34a272c22e1ea8f43b0fbd06df8
Apr 16 16:36:10.265965 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:10.265934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-ksqsj" event={"ID":"dff582a4-3ff7-4498-aa36-170e46df357d","Type":"ContainerStarted","Data":"fece736993a2a8520c88ca1084d347fc2760b34a272c22e1ea8f43b0fbd06df8"}
Apr 16 16:36:11.273305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:11.273273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-ksqsj" event={"ID":"dff582a4-3ff7-4498-aa36-170e46df357d","Type":"ContainerStarted","Data":"d7fb5d5da3ececda7984b3119f597381d1f391d8abd9156b93cd232a2c80042d"}
Apr 16 16:36:11.273766 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:11.273360 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:22.281184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:22.281157 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-ksqsj"
Apr 16 16:36:22.304449 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:36:22.304400 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-ksqsj" podStartSLOduration=11.985472626 podStartE2EDuration="13.304386611s" podCreationTimestamp="2026-04-16 16:36:09 +0000 UTC" firstStartedPulling="2026-04-16 16:36:09.890818495 +0000 UTC m=+830.622056890" lastFinishedPulling="2026-04-16 16:36:11.20973247 +0000 UTC m=+831.940970875" observedRunningTime="2026-04-16 16:36:11.344079662 +0000 UTC m=+832.075318072" watchObservedRunningTime="2026-04-16 16:36:22.304386611 +0000 UTC m=+843.035625069"
Apr 16 16:37:01.555594 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.555553 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"]
Apr 16 16:37:01.559471 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.559447 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.562527 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.562503 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:37:01.562679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.562507 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-8x694\""
Apr 16 16:37:01.562679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.562605 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 16:37:01.563377 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.563359 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:37:01.572365 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.572343 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"]
Apr 16 16:37:01.629695 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.629666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.629856 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.629703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2vb\" (UniqueName: \"kubernetes.io/projected/f22303ca-5b7f-4ff6-be59-e36c009d7541-kube-api-access-tv2vb\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.629856 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.629727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.629966 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.629870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f22303ca-5b7f-4ff6-be59-e36c009d7541-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.731127 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.731092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f22303ca-5b7f-4ff6-be59-e36c009d7541-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.731296 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.731140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.731296 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.731161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv2vb\" (UniqueName: \"kubernetes.io/projected/f22303ca-5b7f-4ff6-be59-e36c009d7541-kube-api-access-tv2vb\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.731296 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.731190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.731498 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.731477 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.731557 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.731537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.733728 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.733711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f22303ca-5b7f-4ff6-be59-e36c009d7541-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.739884 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.739861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv2vb\" (UniqueName: \"kubernetes.io/projected/f22303ca-5b7f-4ff6-be59-e36c009d7541-kube-api-access-tv2vb\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:01.892174 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:01.892059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:02.040699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:02.040666 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"]
Apr 16 16:37:02.043835 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:37:02.043804 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22303ca_5b7f_4ff6_be59_e36c009d7541.slice/crio-c1a496a2b26819a6dc36e0c12808e340688aace61059aa87b24fceb74e28cd7d WatchSource:0}: Error finding container c1a496a2b26819a6dc36e0c12808e340688aace61059aa87b24fceb74e28cd7d: Status 404 returned error can't find the container with id c1a496a2b26819a6dc36e0c12808e340688aace61059aa87b24fceb74e28cd7d
Apr 16 16:37:02.430526 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:02.430487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerStarted","Data":"c1a496a2b26819a6dc36e0c12808e340688aace61059aa87b24fceb74e28cd7d"}
Apr 16 16:37:05.441532 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:05.441496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerStarted","Data":"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"}
Apr 16 16:37:06.447211 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:06.447177 2577 generic.go:358] "Generic (PLEG): container finished" podID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerID="6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9" exitCode=0
Apr 16 16:37:06.447592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:06.447261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerDied","Data":"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"}
Apr 16 16:37:08.459009 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:08.458976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerStarted","Data":"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"}
Apr 16 16:37:13.480239 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:13.480194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerStarted","Data":"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"}
Apr 16 16:37:13.480751 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:13.480344 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:21.892771 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:21.892739 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:21.892771 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:21.892770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:21.895352 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:21.895330 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:21.917587 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:21.917404 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" podStartSLOduration=10.245433146 podStartE2EDuration="20.917366768s" podCreationTimestamp="2026-04-16 16:37:01 +0000 UTC" firstStartedPulling="2026-04-16 16:37:02.045691188 +0000 UTC m=+882.776929582" lastFinishedPulling="2026-04-16 16:37:12.717624811 +0000 UTC m=+893.448863204" observedRunningTime="2026-04-16 16:37:13.509437655 +0000 UTC m=+894.240676070" watchObservedRunningTime="2026-04-16 16:37:21.917366768 +0000 UTC m=+902.648605185"
Apr 16 16:37:22.508593 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:22.508563 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:43.512845 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:43.512818 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:44.819742 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:44.819706 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"]
Apr 16 16:37:44.820129 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:44.820064 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="main" containerID="cri-o://1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6" gracePeriod=30
Apr 16 16:37:44.820208 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:44.820112 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="tokenizer" containerID="cri-o://d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0" gracePeriod=30
Apr 16 16:37:45.175947 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.175899 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:45.270067 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.270034 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-kserve-provision-location\") pod \"f22303ca-5b7f-4ff6-be59-e36c009d7541\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") "
Apr 16 16:37:45.270244 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.270081 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-tokenizer-uds\") pod \"f22303ca-5b7f-4ff6-be59-e36c009d7541\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") "
Apr 16 16:37:45.270244 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.270104 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f22303ca-5b7f-4ff6-be59-e36c009d7541-tls-certs\") pod \"f22303ca-5b7f-4ff6-be59-e36c009d7541\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") "
Apr 16 16:37:45.270244 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.270129 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv2vb\" (UniqueName: \"kubernetes.io/projected/f22303ca-5b7f-4ff6-be59-e36c009d7541-kube-api-access-tv2vb\") pod \"f22303ca-5b7f-4ff6-be59-e36c009d7541\" (UID: \"f22303ca-5b7f-4ff6-be59-e36c009d7541\") "
Apr 16 16:37:45.270408 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.270388 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f22303ca-5b7f-4ff6-be59-e36c009d7541" (UID: "f22303ca-5b7f-4ff6-be59-e36c009d7541"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:37:45.270814 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.270787 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f22303ca-5b7f-4ff6-be59-e36c009d7541" (UID: "f22303ca-5b7f-4ff6-be59-e36c009d7541"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:37:45.272463 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.272441 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22303ca-5b7f-4ff6-be59-e36c009d7541-kube-api-access-tv2vb" (OuterVolumeSpecName: "kube-api-access-tv2vb") pod "f22303ca-5b7f-4ff6-be59-e36c009d7541" (UID: "f22303ca-5b7f-4ff6-be59-e36c009d7541"). InnerVolumeSpecName "kube-api-access-tv2vb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:37:45.272544 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.272520 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22303ca-5b7f-4ff6-be59-e36c009d7541-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f22303ca-5b7f-4ff6-be59-e36c009d7541" (UID: "f22303ca-5b7f-4ff6-be59-e36c009d7541"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:37:45.371169 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.371081 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:37:45.371169 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.371111 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f22303ca-5b7f-4ff6-be59-e36c009d7541-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:37:45.371169 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.371125 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f22303ca-5b7f-4ff6-be59-e36c009d7541-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:37:45.371169 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.371136 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tv2vb\" (UniqueName: \"kubernetes.io/projected/f22303ca-5b7f-4ff6-be59-e36c009d7541-kube-api-access-tv2vb\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:37:45.588757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588718 2577 generic.go:358] "Generic (PLEG): container finished" podID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerID="d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0" exitCode=0
Apr 16 16:37:45.588757 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588745 2577 generic.go:358] "Generic (PLEG): container finished" podID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerID="1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6" exitCode=0
Apr 16 16:37:45.588969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588791 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"
Apr 16 16:37:45.588969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588801 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerDied","Data":"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"}
Apr 16 16:37:45.588969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerDied","Data":"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"}
Apr 16 16:37:45.588969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v" event={"ID":"f22303ca-5b7f-4ff6-be59-e36c009d7541","Type":"ContainerDied","Data":"c1a496a2b26819a6dc36e0c12808e340688aace61059aa87b24fceb74e28cd7d"}
Apr 16 16:37:45.588969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.588866 2577 scope.go:117] "RemoveContainer" containerID="d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"
Apr 16 16:37:45.597008 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.596993 2577 scope.go:117] "RemoveContainer" containerID="1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"
Apr 16 16:37:45.604212 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.604197 2577 scope.go:117] "RemoveContainer" containerID="6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"
Apr 16 16:37:45.611551 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.611528 2577 scope.go:117] "RemoveContainer" containerID="d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"
Apr 16 16:37:45.611835 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:37:45.611808 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0\": container with ID starting with d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0 not found: ID does not exist" containerID="d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"
Apr 16 16:37:45.611928 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.611845 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"} err="failed to get container status \"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0\": rpc error: code = NotFound desc = could not find container \"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0\": container with ID starting with d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0 not found: ID does not exist"
Apr 16 16:37:45.611928 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.611870 2577 scope.go:117] "RemoveContainer" containerID="1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"
Apr 16 16:37:45.612083 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612067 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"]
Apr 16 16:37:45.612157 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:37:45.612143 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6\": container with ID starting with 1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6 not found: ID does not exist" containerID="1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"
Apr 16 16:37:45.612193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612165 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"} err="failed to get container status \"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6\": rpc error: code = NotFound desc = could not find container \"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6\": container with ID starting with 1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6 not found: ID does not exist"
Apr 16 16:37:45.612193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612180 2577 scope.go:117] "RemoveContainer" containerID="6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"
Apr 16 16:37:45.612422 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:37:45.612407 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9\": container with ID starting with 6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9 not found: ID does not exist" containerID="6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"
Apr 16 16:37:45.612473 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612424 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"} err="failed to get container status \"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9\": rpc error: code = NotFound desc = could not find container \"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9\": container with ID starting with 6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9 not found: ID does not exist"
Apr 16 16:37:45.612473 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612435 2577 scope.go:117] "RemoveContainer" containerID="d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"
Apr 16 16:37:45.612706 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612669 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0"} err="failed to get container status \"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0\": rpc error: code = NotFound desc = could not find container \"d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0\": container with ID starting with d305a678962cdeb7706bd10e2d48e6244d9fd9cbddfd15b5c6a3978dfb62dfc0 not found: ID does not exist"
Apr 16 16:37:45.612706 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612699 2577 scope.go:117] "RemoveContainer" containerID="1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"
Apr 16 16:37:45.612932 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612909 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6"} err="failed to get container status \"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6\": rpc error: code = NotFound desc = could not find container \"1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6\": container with ID starting with 1e0db419140e93e151926e5b3e3afb6e05d63868bc39f17e846b9879c9d500f6 not found: ID does not exist"
Apr 16 16:37:45.612986 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.612935 2577 scope.go:117] "RemoveContainer" containerID="6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"
Apr 16 16:37:45.613164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.613146 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9"} err="failed to get container status \"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9\": rpc error: code = NotFound desc = could not find container \"6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9\": container with ID starting with 6397a83f21221f692ff2fc1e9f7991893a989d5d957ce818534f4d8559b0edc9 not found: ID does not exist"
Apr 16 16:37:45.615911 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.615893 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7f6f582p8v"]
Apr 16 16:37:45.880609 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:45.880578 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" path="/var/lib/kubelet/pods/f22303ca-5b7f-4ff6-be59-e36c009d7541/volumes"
Apr 16 16:37:50.113858 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.113822 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"]
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114231 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="main"
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114250 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="main"
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114269 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="tokenizer"
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114277 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="tokenizer"
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114297 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="storage-initializer"
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114306 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="storage-initializer"
Apr 16 16:37:50.114392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114393 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="main"
Apr 16 16:37:50.114774 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.114405 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f22303ca-5b7f-4ff6-be59-e36c009d7541" containerName="tokenizer"
Apr 16 16:37:50.119221 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.119198 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.122550 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.122518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:37:50.122550 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.122521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 16:37:50.122749 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.122603 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:37:50.122871 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.122855 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-8zwgx\""
Apr 16 16:37:50.131142 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.131119 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"]
Apr 16 16:37:50.308027 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.307994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg9pp\" (UniqueName: \"kubernetes.io/projected/ca5d49c0-22c9-436f-8943-f84962722ced-kube-api-access-qg9pp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.308177 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.308034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5d49c0-22c9-436f-8943-f84962722ced-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.308177 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.308058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.308250 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.308174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.408981 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.408907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.408981 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.408958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qg9pp\" (UniqueName: \"kubernetes.io/projected/ca5d49c0-22c9-436f-8943-f84962722ced-kube-api-access-qg9pp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.408981 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.408979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5d49c0-22c9-436f-8943-f84962722ced-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.409255 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.409005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.409316 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.409278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.409316 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.409292 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.411603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.411577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5d49c0-22c9-436f-8943-f84962722ced-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.418006 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.417985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg9pp\" (UniqueName: \"kubernetes.io/projected/ca5d49c0-22c9-436f-8943-f84962722ced-kube-api-access-qg9pp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.428786 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.428765 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:50.555560 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.555515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"]
Apr 16 16:37:50.558106 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:37:50.558077 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5d49c0_22c9_436f_8943_f84962722ced.slice/crio-1b8e6188663aac02dbe3cda9d5d5f4074fda6d1f59da41a96172345f0f48ac30 WatchSource:0}: Error finding container 1b8e6188663aac02dbe3cda9d5d5f4074fda6d1f59da41a96172345f0f48ac30: Status 404 returned error can't find the container with id 1b8e6188663aac02dbe3cda9d5d5f4074fda6d1f59da41a96172345f0f48ac30
Apr 16 16:37:50.607768 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:50.607735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerStarted","Data":"1b8e6188663aac02dbe3cda9d5d5f4074fda6d1f59da41a96172345f0f48ac30"}
Apr 16 16:37:51.612483 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:51.612456 2577 generic.go:358] "Generic (PLEG): container finished" podID="ca5d49c0-22c9-436f-8943-f84962722ced" containerID="53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3" exitCode=0
Apr 16 16:37:51.612851 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:51.612492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerDied","Data":"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3"}
Apr 16 16:37:52.617591 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:52.617556 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerStarted","Data":"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"}
Apr 16 16:37:52.617591 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:52.617589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerStarted","Data":"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"}
Apr 16 16:37:52.618129 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:52.617684 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:37:52.654146 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:37:52.654097 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" podStartSLOduration=2.654080257 podStartE2EDuration="2.654080257s" podCreationTimestamp="2026-04-16 16:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:37:52.648506154 +0000 UTC m=+933.379744569" watchObservedRunningTime="2026-04-16 16:37:52.654080257 +0000 UTC m=+933.385318674"
Apr 16 16:38:00.429190 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:00.429151 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:00.429678 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:00.429210 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:00.431801 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:00.431776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:00.642716 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:00.642694 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:16.434988 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.434950 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"]
Apr 16 16:38:16.439083 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.439063 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.441713 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.441689 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-v2rdb\""
Apr 16 16:38:16.441822 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.441715 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 16:38:16.449608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.449586 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"]
Apr 16 16:38:16.487206 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.487177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.487353 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.487222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.487353 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.487327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhj6\" (UniqueName: \"kubernetes.io/projected/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kube-api-access-7vhj6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.487472 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.487373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.588448 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.588418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vhj6\" (UniqueName: \"kubernetes.io/projected/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kube-api-access-7vhj6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.588592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.588457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.588592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.588486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.588802 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.588616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.588869 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.588829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.588913 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.588893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.591079 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.591059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.598011 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.597989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhj6\" (UniqueName: \"kubernetes.io/projected/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kube-api-access-7vhj6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.748864 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.748779 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:16.878526 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:16.878498 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"]
Apr 16 16:38:16.883123 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:38:16.883088 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc30a85f_dc2b_43e6_a639_45b2a41bd6c6.slice/crio-fb4d1751a88df3cc084c4e69e705edaa06360e31ce0247158becd772dc4da28d WatchSource:0}: Error finding container fb4d1751a88df3cc084c4e69e705edaa06360e31ce0247158becd772dc4da28d: Status 404 returned error can't find the container with id fb4d1751a88df3cc084c4e69e705edaa06360e31ce0247158becd772dc4da28d
Apr 16 16:38:17.702574 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:17.702528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerStarted","Data":"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3"}
Apr 16 16:38:17.703021 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:17.702582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerStarted","Data":"fb4d1751a88df3cc084c4e69e705edaa06360e31ce0247158becd772dc4da28d"}
Apr 16 16:38:18.707229 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:18.707193 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerID="5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3" exitCode=0
Apr 16 16:38:18.707229 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:18.707228 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerDied","Data":"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3"}
Apr 16 16:38:19.713192 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:19.713157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerStarted","Data":"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9"}
Apr 16 16:38:19.713192 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:19.713194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerStarted","Data":"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a"}
Apr 16 16:38:19.713623 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:19.713310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:19.736921 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:19.736876 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" podStartSLOduration=3.736861762 podStartE2EDuration="3.736861762s" podCreationTimestamp="2026-04-16 16:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:38:19.735107998 +0000 UTC m=+960.466346418" watchObservedRunningTime="2026-04-16 16:38:19.736861762 +0000 UTC m=+960.468100177"
Apr 16 16:38:21.647167 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:21.647139 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:26.019181 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.019136 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"]
Apr 16 16:38:26.020072 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.020014 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="main" containerID="cri-o://39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881" gracePeriod=30
Apr 16 16:38:26.020228 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.020060 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="tokenizer" containerID="cri-o://1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c" gracePeriod=30
Apr 16 16:38:26.365803 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.365782 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:26.466317 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.466285 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg9pp\" (UniqueName: \"kubernetes.io/projected/ca5d49c0-22c9-436f-8943-f84962722ced-kube-api-access-qg9pp\") pod \"ca5d49c0-22c9-436f-8943-f84962722ced\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") "
Apr 16 16:38:26.466505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.466323 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-kserve-provision-location\") pod \"ca5d49c0-22c9-436f-8943-f84962722ced\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") "
Apr 16 16:38:26.466505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.466354 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5d49c0-22c9-436f-8943-f84962722ced-tls-certs\") pod \"ca5d49c0-22c9-436f-8943-f84962722ced\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") "
Apr 16 16:38:26.466505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.466384 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-tokenizer-uds\") pod \"ca5d49c0-22c9-436f-8943-f84962722ced\" (UID: \"ca5d49c0-22c9-436f-8943-f84962722ced\") "
Apr 16 16:38:26.466772 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.466738 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ca5d49c0-22c9-436f-8943-f84962722ced" (UID: "ca5d49c0-22c9-436f-8943-f84962722ced"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:38:26.467059 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.467033 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca5d49c0-22c9-436f-8943-f84962722ced" (UID: "ca5d49c0-22c9-436f-8943-f84962722ced"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:38:26.468562 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.468540 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5d49c0-22c9-436f-8943-f84962722ced-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ca5d49c0-22c9-436f-8943-f84962722ced" (UID: "ca5d49c0-22c9-436f-8943-f84962722ced"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:38:26.468752 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.468710 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5d49c0-22c9-436f-8943-f84962722ced-kube-api-access-qg9pp" (OuterVolumeSpecName: "kube-api-access-qg9pp") pod "ca5d49c0-22c9-436f-8943-f84962722ced" (UID: "ca5d49c0-22c9-436f-8943-f84962722ced"). InnerVolumeSpecName "kube-api-access-qg9pp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:38:26.566904 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.566839 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:38:26.566904 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.566864 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qg9pp\" (UniqueName: \"kubernetes.io/projected/ca5d49c0-22c9-436f-8943-f84962722ced-kube-api-access-qg9pp\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:38:26.566904 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.566874 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca5d49c0-22c9-436f-8943-f84962722ced-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:38:26.566904 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.566884 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5d49c0-22c9-436f-8943-f84962722ced-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:38:26.739127 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739088 2577 generic.go:358] "Generic (PLEG): container finished" podID="ca5d49c0-22c9-436f-8943-f84962722ced" containerID="1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c" exitCode=0
Apr 16 16:38:26.739127 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739119 2577 generic.go:358] "Generic (PLEG): container finished" podID="ca5d49c0-22c9-436f-8943-f84962722ced" containerID="39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881" exitCode=0
Apr 16 16:38:26.739342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739166 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"
Apr 16 16:38:26.739342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739184 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerDied","Data":"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"}
Apr 16 16:38:26.739342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739225 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerDied","Data":"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"}
Apr 16 16:38:26.739342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2" event={"ID":"ca5d49c0-22c9-436f-8943-f84962722ced","Type":"ContainerDied","Data":"1b8e6188663aac02dbe3cda9d5d5f4074fda6d1f59da41a96172345f0f48ac30"}
Apr 16 16:38:26.739342 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.739251 2577 scope.go:117] "RemoveContainer" containerID="1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"
Apr 16 16:38:26.748987 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.748955 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:26.748987 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.748999 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:26.752162 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.752124 2577 scope.go:117] "RemoveContainer" containerID="39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"
Apr 16 16:38:26.752162 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.752140 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"
Apr 16 16:38:26.759491 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.759481 2577 scope.go:117] "RemoveContainer" containerID="53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3"
Apr 16 16:38:26.766226 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766210 2577 scope.go:117] "RemoveContainer" containerID="1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"
Apr 16 16:38:26.766434 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:38:26.766417 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c\": container with ID starting with 1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c not found: ID does not exist" containerID="1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"
Apr 16 16:38:26.766503 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766441 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"} err="failed to get container status \"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c\": rpc error: code = NotFound desc = could not find container \"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c\": container with ID starting with 1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c not found: ID does not exist"
Apr 16 16:38:26.766503 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766459 2577 scope.go:117] "RemoveContainer" containerID="39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"
Apr 16 16:38:26.766720 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:38:26.766705 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881\": container with ID starting with 39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881 not found: ID does not exist" containerID="39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"
Apr 16 16:38:26.766778 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766725 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"} err="failed to get container status \"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881\": rpc error: code = NotFound desc = could not find container \"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881\": container with ID starting with 39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881 not found: ID does not exist"
Apr 16 16:38:26.766778 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766739 2577 scope.go:117] "RemoveContainer" containerID="53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3"
Apr 16 16:38:26.766956 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:38:26.766943 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3\": container with ID starting with 53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3 not found: ID does not exist" containerID="53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3"
Apr 16 16:38:26.766996 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766959 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3"} err="failed to get container status \"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3\": rpc error: code = NotFound desc = could not find container \"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3\": container with ID starting with 53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3 not found: ID does not exist"
Apr 16 16:38:26.766996 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.766979 2577 scope.go:117] "RemoveContainer" containerID="1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"
Apr 16 16:38:26.767195 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.767179 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c"} err="failed to get container status \"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c\": rpc error: code = NotFound desc = could not find container \"1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c\": container with ID starting with 1cb6f5a379e0cc437059de74c9acc57ce4531cfda24365d805fd227adee5345c not found: ID does not exist"
Apr 16 16:38:26.767236 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.767195 2577 scope.go:117] "RemoveContainer" containerID="39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"
Apr 16 16:38:26.767418 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.767398 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881"} err="failed to get container status \"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881\": rpc error: code = NotFound desc = could not find container \"39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881\": container 
with ID starting with 39712c03f02d9cb413a4ded504f119150c293bce1227dad61897529a7641c881 not found: ID does not exist" Apr 16 16:38:26.767461 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.767420 2577 scope.go:117] "RemoveContainer" containerID="53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3" Apr 16 16:38:26.767618 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.767602 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3"} err="failed to get container status \"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3\": rpc error: code = NotFound desc = could not find container \"53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3\": container with ID starting with 53e4df836f5f5fb07321e13726fbaaf31007ec0206810da621013a4234ab9aa3 not found: ID does not exist" Apr 16 16:38:26.787238 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.787210 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"] Apr 16 16:38:26.792887 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:26.792865 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf8456d2m2"] Apr 16 16:38:27.744408 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:27.744373 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" Apr 16 16:38:27.880478 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:27.880449 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" path="/var/lib/kubelet/pods/ca5d49c0-22c9-436f-8943-f84962722ced/volumes" Apr 16 16:38:48.748164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:38:48.748136 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" Apr 16 16:40:33.083562 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.083454 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"] Apr 16 16:40:33.084125 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.083874 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="main" containerID="cri-o://5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a" gracePeriod=30 Apr 16 16:40:33.084267 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.084231 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="tokenizer" containerID="cri-o://641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9" gracePeriod=30 Apr 16 16:40:33.436603 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.436580 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" Apr 16 16:40:33.531116 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531086 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vhj6\" (UniqueName: \"kubernetes.io/projected/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kube-api-access-7vhj6\") pod \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " Apr 16 16:40:33.531116 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531117 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tls-certs\") pod \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " Apr 16 16:40:33.531330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531150 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tokenizer-uds\") pod \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " Apr 16 16:40:33.531330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531271 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kserve-provision-location\") pod \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\" (UID: \"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6\") " Apr 16 16:40:33.531411 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531390 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" (UID: "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:40:33.531541 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531520 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:40:33.531943 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.531919 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" (UID: "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:40:33.533308 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.533278 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kube-api-access-7vhj6" (OuterVolumeSpecName: "kube-api-access-7vhj6") pod "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" (UID: "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6"). InnerVolumeSpecName "kube-api-access-7vhj6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:40:33.533396 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.533380 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" (UID: "cc30a85f-dc2b-43e6-a639-45b2a41bd6c6"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:40:33.632976 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.632893 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:40:33.632976 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.632922 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:40:33.632976 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:33.632935 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vhj6\" (UniqueName: \"kubernetes.io/projected/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6-kube-api-access-7vhj6\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:40:34.137853 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.137773 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerID="641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9" exitCode=0 Apr 16 16:40:34.137853 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.137798 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerID="5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a" exitCode=0 Apr 16 16:40:34.137853 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.137825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerDied","Data":"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9"} Apr 16 16:40:34.137853 ip-10-0-132-246 kubenswrapper[2577]: 
I0416 16:40:34.137846 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" Apr 16 16:40:34.138516 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.137858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerDied","Data":"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a"} Apr 16 16:40:34.138516 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.137868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt" event={"ID":"cc30a85f-dc2b-43e6-a639-45b2a41bd6c6","Type":"ContainerDied","Data":"fb4d1751a88df3cc084c4e69e705edaa06360e31ce0247158becd772dc4da28d"} Apr 16 16:40:34.138516 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.137882 2577 scope.go:117] "RemoveContainer" containerID="641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9" Apr 16 16:40:34.145598 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.145580 2577 scope.go:117] "RemoveContainer" containerID="5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a" Apr 16 16:40:34.152957 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.152942 2577 scope.go:117] "RemoveContainer" containerID="5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3" Apr 16 16:40:34.158838 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.158815 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"] Apr 16 16:40:34.160729 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.160713 2577 scope.go:117] "RemoveContainer" containerID="641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9" Apr 16 16:40:34.160999 
ip-10-0-132-246 kubenswrapper[2577]: E0416 16:40:34.160982 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9\": container with ID starting with 641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9 not found: ID does not exist" containerID="641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9" Apr 16 16:40:34.161045 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161007 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9"} err="failed to get container status \"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9\": rpc error: code = NotFound desc = could not find container \"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9\": container with ID starting with 641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9 not found: ID does not exist" Apr 16 16:40:34.161045 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161026 2577 scope.go:117] "RemoveContainer" containerID="5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a" Apr 16 16:40:34.161394 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:40:34.161371 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a\": container with ID starting with 5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a not found: ID does not exist" containerID="5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a" Apr 16 16:40:34.161482 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161403 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a"} err="failed to get container status \"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a\": rpc error: code = NotFound desc = could not find container \"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a\": container with ID starting with 5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a not found: ID does not exist" Apr 16 16:40:34.161482 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161420 2577 scope.go:117] "RemoveContainer" containerID="5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3" Apr 16 16:40:34.161682 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:40:34.161665 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3\": container with ID starting with 5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3 not found: ID does not exist" containerID="5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3" Apr 16 16:40:34.161741 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161685 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3"} err="failed to get container status \"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3\": rpc error: code = NotFound desc = could not find container \"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3\": container with ID starting with 5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3 not found: ID does not exist" Apr 16 16:40:34.161741 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161698 2577 scope.go:117] "RemoveContainer" containerID="641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9" Apr 16 16:40:34.161939 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:40:34.161918 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9"} err="failed to get container status \"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9\": rpc error: code = NotFound desc = could not find container \"641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9\": container with ID starting with 641515f344b6e3ef73133bc5b99a97ba318ab10551d6342b4bfcba15cfc7a5e9 not found: ID does not exist" Apr 16 16:40:34.161987 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.161941 2577 scope.go:117] "RemoveContainer" containerID="5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a" Apr 16 16:40:34.162280 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.162259 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a"} err="failed to get container status \"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a\": rpc error: code = NotFound desc = could not find container \"5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a\": container with ID starting with 5b85c334d4dd28657bf7310805b8214e2b86934ee1e5173a0da5420b76f4799a not found: ID does not exist" Apr 16 16:40:34.162368 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.162285 2577 scope.go:117] "RemoveContainer" containerID="5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3" Apr 16 16:40:34.162543 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.162520 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3"} err="failed to get container status \"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3\": rpc error: code = NotFound desc = could not find container 
\"5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3\": container with ID starting with 5effd8340adb843cc5065ebcf09d66ebcad7614576ac5eb057222a764fddd2c3 not found: ID does not exist" Apr 16 16:40:34.165238 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:34.165214 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevmnpt"] Apr 16 16:40:35.879765 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:35.879734 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" path="/var/lib/kubelet/pods/cc30a85f-dc2b-43e6-a639-45b2a41bd6c6/volumes" Apr 16 16:40:40.188278 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188234 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"] Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188549 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="main" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188561 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="main" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188572 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="main" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188577 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="main" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188586 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" 
containerName="tokenizer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188591 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="tokenizer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188598 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="tokenizer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188603 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="tokenizer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188612 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="storage-initializer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188617 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="storage-initializer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188625 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="storage-initializer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188630 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="storage-initializer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188692 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="main" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188701 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" 
containerName="tokenizer" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188709 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc30a85f-dc2b-43e6-a639-45b2a41bd6c6" containerName="main" Apr 16 16:40:40.188758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.188716 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca5d49c0-22c9-436f-8943-f84962722ced" containerName="tokenizer" Apr 16 16:40:40.191699 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.191681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" Apr 16 16:40:40.194289 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.194261 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-5rggk\"" Apr 16 16:40:40.194435 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.194329 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:40:40.194526 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.194499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 16:40:40.194526 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.194516 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:40:40.205520 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.205496 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"] Apr 16 16:40:40.283829 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.283791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" Apr 16 16:40:40.283829 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.283826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" Apr 16 16:40:40.284005 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.283929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6995834d-a13d-4d54-8f2b-018a2b558002-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" Apr 16 16:40:40.284005 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.283979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpk6c\" (UniqueName: \"kubernetes.io/projected/6995834d-a13d-4d54-8f2b-018a2b558002-kube-api-access-hpk6c\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" Apr 16 16:40:40.385162 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.385134 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6995834d-a13d-4d54-8f2b-018a2b558002-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.385345 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.385181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpk6c\" (UniqueName: \"kubernetes.io/projected/6995834d-a13d-4d54-8f2b-018a2b558002-kube-api-access-hpk6c\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.385345 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.385214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.385345 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.385240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.385698 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.385672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.385698 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.385690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.387680 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.387637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6995834d-a13d-4d54-8f2b-018a2b558002-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.393775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.393754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpk6c\" (UniqueName: \"kubernetes.io/projected/6995834d-a13d-4d54-8f2b-018a2b558002-kube-api-access-hpk6c\") pod \"custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.502164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.502103 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:40.623947 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.623881 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"]
Apr 16 16:40:40.626110 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:40:40.626083 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6995834d_a13d_4d54_8f2b_018a2b558002.slice/crio-3486c1fe7e6fdae00a9fa786187da474f4fe5a64353bf747b14852ea68694f5e WatchSource:0}: Error finding container 3486c1fe7e6fdae00a9fa786187da474f4fe5a64353bf747b14852ea68694f5e: Status 404 returned error can't find the container with id 3486c1fe7e6fdae00a9fa786187da474f4fe5a64353bf747b14852ea68694f5e
Apr 16 16:40:40.628030 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:40.628013 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:40:41.162874 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:41.162836 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerStarted","Data":"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"}
Apr 16 16:40:41.162874 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:41.162869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerStarted","Data":"3486c1fe7e6fdae00a9fa786187da474f4fe5a64353bf747b14852ea68694f5e"}
Apr 16 16:40:42.166566 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:42.166533 2577 generic.go:358] "Generic (PLEG): container finished" podID="6995834d-a13d-4d54-8f2b-018a2b558002" containerID="3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3" exitCode=0
Apr 16 16:40:42.166975 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:42.166616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerDied","Data":"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"}
Apr 16 16:40:43.171494 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:43.171452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerStarted","Data":"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"}
Apr 16 16:40:43.171494 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:43.171495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerStarted","Data":"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"}
Apr 16 16:40:43.172055 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:43.171579 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:43.192250 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:43.192195 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" podStartSLOduration=3.192175515 podStartE2EDuration="3.192175515s" podCreationTimestamp="2026-04-16 16:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:40:43.190791653 +0000 UTC m=+1103.922030079" watchObservedRunningTime="2026-04-16 16:40:43.192175515 +0000 UTC m=+1103.923413932"
Apr 16 16:40:50.502723 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:50.502687 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:50.503210 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:50.502740 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:50.505382 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:50.505353 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:40:51.198936 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:40:51.198903 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:41:12.202632 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:41:12.202604 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:42:27.015010 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.014966 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"]
Apr 16 16:42:27.015532 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.015372 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="main" containerID="cri-o://3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0" gracePeriod=30
Apr 16 16:42:27.015532 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.015437 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="tokenizer" containerID="cri-o://09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9" gracePeriod=30
Apr 16 16:42:27.363872 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.363850 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:42:27.399104 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.399078 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-kserve-provision-location\") pod \"6995834d-a13d-4d54-8f2b-018a2b558002\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") "
Apr 16 16:42:27.399233 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.399153 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-tokenizer-uds\") pod \"6995834d-a13d-4d54-8f2b-018a2b558002\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") "
Apr 16 16:42:27.399233 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.399182 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpk6c\" (UniqueName: \"kubernetes.io/projected/6995834d-a13d-4d54-8f2b-018a2b558002-kube-api-access-hpk6c\") pod \"6995834d-a13d-4d54-8f2b-018a2b558002\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") "
Apr 16 16:42:27.399233 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.399222 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6995834d-a13d-4d54-8f2b-018a2b558002-tls-certs\") pod \"6995834d-a13d-4d54-8f2b-018a2b558002\" (UID: \"6995834d-a13d-4d54-8f2b-018a2b558002\") "
Apr 16 16:42:27.399469 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.399441 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6995834d-a13d-4d54-8f2b-018a2b558002" (UID: "6995834d-a13d-4d54-8f2b-018a2b558002"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:42:27.399886 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.399856 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6995834d-a13d-4d54-8f2b-018a2b558002" (UID: "6995834d-a13d-4d54-8f2b-018a2b558002"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:42:27.401282 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.401259 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6995834d-a13d-4d54-8f2b-018a2b558002-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6995834d-a13d-4d54-8f2b-018a2b558002" (UID: "6995834d-a13d-4d54-8f2b-018a2b558002"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:42:27.401412 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.401387 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6995834d-a13d-4d54-8f2b-018a2b558002-kube-api-access-hpk6c" (OuterVolumeSpecName: "kube-api-access-hpk6c") pod "6995834d-a13d-4d54-8f2b-018a2b558002" (UID: "6995834d-a13d-4d54-8f2b-018a2b558002"). InnerVolumeSpecName "kube-api-access-hpk6c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:42:27.500663 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.500621 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:42:27.500663 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.500666 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6995834d-a13d-4d54-8f2b-018a2b558002-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:42:27.500842 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.500682 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpk6c\" (UniqueName: \"kubernetes.io/projected/6995834d-a13d-4d54-8f2b-018a2b558002-kube-api-access-hpk6c\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:42:27.500842 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.500694 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6995834d-a13d-4d54-8f2b-018a2b558002-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:42:27.502222 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502202 2577 generic.go:358] "Generic (PLEG): container finished" podID="6995834d-a13d-4d54-8f2b-018a2b558002" containerID="09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9" exitCode=0
Apr 16 16:42:27.502268 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502223 2577 generic.go:358] "Generic (PLEG): container finished" podID="6995834d-a13d-4d54-8f2b-018a2b558002" containerID="3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0" exitCode=0
Apr 16 16:42:27.502268 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerDied","Data":"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"}
Apr 16 16:42:27.502330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerDied","Data":"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"}
Apr 16 16:42:27.502330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502273 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"
Apr 16 16:42:27.502330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502282 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p" event={"ID":"6995834d-a13d-4d54-8f2b-018a2b558002","Type":"ContainerDied","Data":"3486c1fe7e6fdae00a9fa786187da474f4fe5a64353bf747b14852ea68694f5e"}
Apr 16 16:42:27.502330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.502297 2577 scope.go:117] "RemoveContainer" containerID="09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"
Apr 16 16:42:27.510714 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.510696 2577 scope.go:117] "RemoveContainer" containerID="3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"
Apr 16 16:42:27.517878 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.517844 2577 scope.go:117] "RemoveContainer" containerID="3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"
Apr 16 16:42:27.524516 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.524500 2577 scope.go:117] "RemoveContainer" containerID="09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"
Apr 16 16:42:27.525007 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:42:27.524981 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9\": container with ID starting with 09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9 not found: ID does not exist" containerID="09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"
Apr 16 16:42:27.525193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.525017 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"} err="failed to get container status \"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9\": rpc error: code = NotFound desc = could not find container \"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9\": container with ID starting with 09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9 not found: ID does not exist"
Apr 16 16:42:27.525193 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.525047 2577 scope.go:117] "RemoveContainer" containerID="3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"
Apr 16 16:42:27.525410 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:42:27.525294 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0\": container with ID starting with 3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0 not found: ID does not exist" containerID="3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"
Apr 16 16:42:27.525410 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.525318 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"} err="failed to get container status \"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0\": rpc error: code = NotFound desc = could not find container \"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0\": container with ID starting with 3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0 not found: ID does not exist"
Apr 16 16:42:27.525410 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.525338 2577 scope.go:117] "RemoveContainer" containerID="3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"
Apr 16 16:42:27.525675 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:42:27.525620 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3\": container with ID starting with 3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3 not found: ID does not exist" containerID="3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"
Apr 16 16:42:27.526075 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526044 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"} err="failed to get container status \"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3\": rpc error: code = NotFound desc = could not find container \"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3\": container with ID starting with 3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3 not found: ID does not exist"
Apr 16 16:42:27.526164 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526087 2577 scope.go:117] "RemoveContainer" containerID="09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"
Apr 16 16:42:27.526379 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526347 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9"} err="failed to get container status \"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9\": rpc error: code = NotFound desc = could not find container \"09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9\": container with ID starting with 09707317fb15977c24d4a1cde011d58e1a91e0bd04d818b57f45936976dcbaf9 not found: ID does not exist"
Apr 16 16:42:27.526448 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526381 2577 scope.go:117] "RemoveContainer" containerID="3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"
Apr 16 16:42:27.526619 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526598 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0"} err="failed to get container status \"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0\": rpc error: code = NotFound desc = could not find container \"3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0\": container with ID starting with 3253de392111a06c1ba26d36830e9b85ed9fad34e3f3fc755f7c4f8a005617d0 not found: ID does not exist"
Apr 16 16:42:27.526724 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526621 2577 scope.go:117] "RemoveContainer" containerID="3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"
Apr 16 16:42:27.526858 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526840 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3"} err="failed to get container status \"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3\": rpc error: code = NotFound desc = could not find container \"3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3\": container with ID starting with 3b46e7991dca787e42f082db5c6212c208768bc8aca85f581b946bc94b7966c3 not found: ID does not exist"
Apr 16 16:42:27.526953 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.526937 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"]
Apr 16 16:42:27.535144 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.535089 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6986d4b88629p"]
Apr 16 16:42:27.880530 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:27.880456 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" path="/var/lib/kubelet/pods/6995834d-a13d-4d54-8f2b-018a2b558002/volumes"
Apr 16 16:42:39.144223 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144192 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"]
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144590 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="storage-initializer"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144610 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="storage-initializer"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144621 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="tokenizer"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144630 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="tokenizer"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144695 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="main"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144704 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="main"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144788 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="main"
Apr 16 16:42:39.144835 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.144799 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6995834d-a13d-4d54-8f2b-018a2b558002" containerName="tokenizer"
Apr 16 16:42:39.147922 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.147898 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.151265 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.151243 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:42:39.151994 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.151967 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-9nckc\""
Apr 16 16:42:39.152095 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.152005 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:42:39.152095 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.152024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 16:42:39.164132 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.164113 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"]
Apr 16 16:42:39.290817 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.290786 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.290943 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.290831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.290943 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.290856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.290943 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.290920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpprw\" (UniqueName: \"kubernetes.io/projected/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kube-api-access-kpprw\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.391412 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.391381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.391412 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.391417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.391592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.391440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpprw\" (UniqueName: \"kubernetes.io/projected/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kube-api-access-kpprw\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.391592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.391471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.391801 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.391781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.391844 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.391814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.393975 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.393957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.400769 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.400716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpprw\" (UniqueName: \"kubernetes.io/projected/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kube-api-access-kpprw\") pod \"router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.457472 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.457446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:39.620814 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:39.620771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"]
Apr 16 16:42:39.623732 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:42:39.623692 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6abdedb5_69c8_43da_bde4_6b8cf44cf34b.slice/crio-9033e1a9b9eeadb930304e4ea5eed38358082ec3385bb0802529cfad07af2428 WatchSource:0}: Error finding container 9033e1a9b9eeadb930304e4ea5eed38358082ec3385bb0802529cfad07af2428: Status 404 returned error can't find the container with id 9033e1a9b9eeadb930304e4ea5eed38358082ec3385bb0802529cfad07af2428
Apr 16 16:42:40.549880 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:40.549843 2577 generic.go:358] "Generic (PLEG): container finished" podID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerID="a1a5b5422b83dad93fa73217022347ebecdea7dd55231ee7d3ec4cf169893b6f" exitCode=0
Apr 16 16:42:40.550238 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:40.549889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerDied","Data":"a1a5b5422b83dad93fa73217022347ebecdea7dd55231ee7d3ec4cf169893b6f"}
Apr 16 16:42:40.550238 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:40.549916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerStarted","Data":"9033e1a9b9eeadb930304e4ea5eed38358082ec3385bb0802529cfad07af2428"}
Apr 16 16:42:41.556031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:41.555993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerStarted","Data":"c2560f618590f8d8a7e8744246cab46b417ca31e0f8c15c660bc8b4d4ea24fb6"}
Apr 16 16:42:41.556397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:41.556038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerStarted","Data":"68c3817a5b4ec33938879fc260aeb1e72e0b586498da873d47a4d90abc1c2bb3"}
Apr 16 16:42:41.556397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:41.556119 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:41.578536 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:41.578489 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" podStartSLOduration=2.578477009 podStartE2EDuration="2.578477009s" podCreationTimestamp="2026-04-16 16:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:42:41.57677633 +0000 UTC m=+1222.308014757" watchObservedRunningTime="2026-04-16 16:42:41.578477009 +0000 UTC m=+1222.309715422"
Apr 16 16:42:49.457914 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:49.457864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:49.457914 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:49.457926 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:49.465111 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:49.465084 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:42:49.583089 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:42:49.583065 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:43:10.586133 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:43:10.586055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"
Apr 16 16:44:47.628282 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.628246 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"]
Apr 16 16:44:47.628831 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.628532 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="main" containerID="cri-o://68c3817a5b4ec33938879fc260aeb1e72e0b586498da873d47a4d90abc1c2bb3" gracePeriod=30
Apr 16 16:44:47.628831 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.628596 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="tokenizer" containerID="cri-o://c2560f618590f8d8a7e8744246cab46b417ca31e0f8c15c660bc8b4d4ea24fb6" gracePeriod=30
Apr 16 16:44:47.959770 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.959738 2577 generic.go:358] "Generic (PLEG):
container finished" podID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerID="c2560f618590f8d8a7e8744246cab46b417ca31e0f8c15c660bc8b4d4ea24fb6" exitCode=0 Apr 16 16:44:47.959770 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.959763 2577 generic.go:358] "Generic (PLEG): container finished" podID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerID="68c3817a5b4ec33938879fc260aeb1e72e0b586498da873d47a4d90abc1c2bb3" exitCode=0 Apr 16 16:44:47.959969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.959794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerDied","Data":"c2560f618590f8d8a7e8744246cab46b417ca31e0f8c15c660bc8b4d4ea24fb6"} Apr 16 16:44:47.959969 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.959821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerDied","Data":"68c3817a5b4ec33938879fc260aeb1e72e0b586498da873d47a4d90abc1c2bb3"} Apr 16 16:44:47.982324 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:47.982301 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" Apr 16 16:44:48.070620 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.070586 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kserve-provision-location\") pod \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " Apr 16 16:44:48.070832 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.070633 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tokenizer-uds\") pod \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " Apr 16 16:44:48.070832 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.070730 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tls-certs\") pod \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " Apr 16 16:44:48.070832 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.070755 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpprw\" (UniqueName: \"kubernetes.io/projected/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kube-api-access-kpprw\") pod \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\" (UID: \"6abdedb5-69c8-43da-bde4-6b8cf44cf34b\") " Apr 16 16:44:48.071019 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.070984 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6abdedb5-69c8-43da-bde4-6b8cf44cf34b" (UID: "6abdedb5-69c8-43da-bde4-6b8cf44cf34b"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:48.071351 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.071325 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6abdedb5-69c8-43da-bde4-6b8cf44cf34b" (UID: "6abdedb5-69c8-43da-bde4-6b8cf44cf34b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:44:48.073042 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.073014 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6abdedb5-69c8-43da-bde4-6b8cf44cf34b" (UID: "6abdedb5-69c8-43da-bde4-6b8cf44cf34b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:44:48.073156 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.073089 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kube-api-access-kpprw" (OuterVolumeSpecName: "kube-api-access-kpprw") pod "6abdedb5-69c8-43da-bde4-6b8cf44cf34b" (UID: "6abdedb5-69c8-43da-bde4-6b8cf44cf34b"). InnerVolumeSpecName "kube-api-access-kpprw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:44:48.171961 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.171873 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:44:48.171961 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.171905 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpprw\" (UniqueName: \"kubernetes.io/projected/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kube-api-access-kpprw\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:44:48.171961 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.171919 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:44:48.171961 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.171931 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6abdedb5-69c8-43da-bde4-6b8cf44cf34b-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:44:48.964850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.964815 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" event={"ID":"6abdedb5-69c8-43da-bde4-6b8cf44cf34b","Type":"ContainerDied","Data":"9033e1a9b9eeadb930304e4ea5eed38358082ec3385bb0802529cfad07af2428"} Apr 16 16:44:48.965272 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.964863 2577 scope.go:117] "RemoveContainer" containerID="c2560f618590f8d8a7e8744246cab46b417ca31e0f8c15c660bc8b4d4ea24fb6" Apr 16 16:44:48.965272 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.964881 2577 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6" Apr 16 16:44:48.973292 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.973272 2577 scope.go:117] "RemoveContainer" containerID="68c3817a5b4ec33938879fc260aeb1e72e0b586498da873d47a4d90abc1c2bb3" Apr 16 16:44:48.980422 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.980405 2577 scope.go:117] "RemoveContainer" containerID="a1a5b5422b83dad93fa73217022347ebecdea7dd55231ee7d3ec4cf169893b6f" Apr 16 16:44:48.987802 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.987782 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"] Apr 16 16:44:48.993004 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:48.992983 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-cc4b6bc7f-tpxf6"] Apr 16 16:44:49.879513 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:49.879472 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" path="/var/lib/kubelet/pods/6abdedb5-69c8-43da-bde4-6b8cf44cf34b/volumes" Apr 16 16:44:53.356733 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.356692 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48"] Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357109 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="tokenizer" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357124 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="tokenizer" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 
16:44:53.357138 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="storage-initializer" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357147 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="storage-initializer" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357159 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="main" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357167 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="main" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357250 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="tokenizer" Apr 16 16:44:53.357305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.357263 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6abdedb5-69c8-43da-bde4-6b8cf44cf34b" containerName="main" Apr 16 16:44:53.362336 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.362317 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.365084 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.365061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-b78zf\"" Apr 16 16:44:53.365208 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.365084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:44:53.365208 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.365083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 16:44:53.365967 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.365954 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:44:53.371926 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.371903 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48"] Apr 16 16:44:53.410713 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.410681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.410902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.410725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.410902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.410753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.410902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.410817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8bj\" (UniqueName: \"kubernetes.io/projected/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kube-api-access-xf8bj\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.511926 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.511873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8bj\" (UniqueName: \"kubernetes.io/projected/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kube-api-access-xf8bj\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.512107 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.512002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.512107 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.512051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.512107 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.512083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.512566 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.512512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.512566 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.512549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.514964 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.514941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.522142 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.522115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8bj\" (UniqueName: \"kubernetes.io/projected/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kube-api-access-xf8bj\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.673330 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.673241 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:53.811741 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.811705 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48"] Apr 16 16:44:53.815529 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:44:53.815480 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a8ddfe_6111_4cc5_a549_bf6275279ef1.slice/crio-f538a06cc0e707b2655d6b7bb00dbd0fac4d751d871f1a4901579c24ba9ec078 WatchSource:0}: Error finding container f538a06cc0e707b2655d6b7bb00dbd0fac4d751d871f1a4901579c24ba9ec078: Status 404 returned error can't find the container with id f538a06cc0e707b2655d6b7bb00dbd0fac4d751d871f1a4901579c24ba9ec078 Apr 16 16:44:53.983074 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.983036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerStarted","Data":"4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a"} Apr 16 16:44:53.983074 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:53.983071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerStarted","Data":"f538a06cc0e707b2655d6b7bb00dbd0fac4d751d871f1a4901579c24ba9ec078"} Apr 16 16:44:54.987657 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:54.987608 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerID="4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a" exitCode=0 Apr 16 16:44:54.988126 ip-10-0-132-246 kubenswrapper[2577]: I0416 
16:44:54.987699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerDied","Data":"4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a"} Apr 16 16:44:55.992476 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:55.992442 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerStarted","Data":"20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe"} Apr 16 16:44:55.992476 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:55.992476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerStarted","Data":"3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125"} Apr 16 16:44:55.992965 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:55.992635 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:44:56.015035 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:44:56.014983 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" podStartSLOduration=3.014962879 podStartE2EDuration="3.014962879s" podCreationTimestamp="2026-04-16 16:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:44:56.012274629 +0000 UTC m=+1356.743513062" watchObservedRunningTime="2026-04-16 16:44:56.014962879 +0000 UTC m=+1356.746201295" Apr 16 16:45:03.674310 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:45:03.674270 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:45:03.674809 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:45:03.674324 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:45:03.677020 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:45:03.676999 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:45:04.018616 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:45:04.018576 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:45:25.021505 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:45:25.021476 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:48:33.550085 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.550050 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48"] Apr 16 16:48:33.552991 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.550366 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="main" containerID="cri-o://3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125" gracePeriod=30 Apr 16 16:48:33.552991 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.550409 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="tokenizer" containerID="cri-o://20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe" gracePeriod=30 Apr 16 16:48:33.675902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.675868 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerID="3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125" exitCode=0 Apr 16 16:48:33.676046 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.675933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerDied","Data":"3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125"} Apr 16 16:48:33.906573 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.906548 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:48:33.987554 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.987527 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tokenizer-uds\") pod \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " Apr 16 16:48:33.988012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.987562 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kserve-provision-location\") pod \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " Apr 16 16:48:33.988012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.987590 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tls-certs\") pod \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " Apr 16 16:48:33.988012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.987634 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf8bj\" (UniqueName: \"kubernetes.io/projected/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kube-api-access-xf8bj\") pod \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\" (UID: \"b9a8ddfe-6111-4cc5-a549-bf6275279ef1\") " Apr 16 16:48:33.988012 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.987878 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b9a8ddfe-6111-4cc5-a549-bf6275279ef1" (UID: "b9a8ddfe-6111-4cc5-a549-bf6275279ef1"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:33.988319 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.988292 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b9a8ddfe-6111-4cc5-a549-bf6275279ef1" (UID: "b9a8ddfe-6111-4cc5-a549-bf6275279ef1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:33.989961 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.989933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kube-api-access-xf8bj" (OuterVolumeSpecName: "kube-api-access-xf8bj") pod "b9a8ddfe-6111-4cc5-a549-bf6275279ef1" (UID: "b9a8ddfe-6111-4cc5-a549-bf6275279ef1"). InnerVolumeSpecName "kube-api-access-xf8bj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:48:33.989961 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:33.989952 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b9a8ddfe-6111-4cc5-a549-bf6275279ef1" (UID: "b9a8ddfe-6111-4cc5-a549-bf6275279ef1"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:48:34.088543 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.088458 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xf8bj\" (UniqueName: \"kubernetes.io/projected/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kube-api-access-xf8bj\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:48:34.088543 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.088486 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:48:34.088543 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.088499 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:48:34.088543 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.088510 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b9a8ddfe-6111-4cc5-a549-bf6275279ef1-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:48:34.682166 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.682132 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerID="20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe" exitCode=0 Apr 16 16:48:34.682638 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.682212 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" Apr 16 16:48:34.682638 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.682226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerDied","Data":"20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe"} Apr 16 16:48:34.682638 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.682280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48" event={"ID":"b9a8ddfe-6111-4cc5-a549-bf6275279ef1","Type":"ContainerDied","Data":"f538a06cc0e707b2655d6b7bb00dbd0fac4d751d871f1a4901579c24ba9ec078"} Apr 16 16:48:34.682638 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.682306 2577 scope.go:117] "RemoveContainer" containerID="20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe" Apr 16 16:48:34.690997 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.690981 2577 scope.go:117] "RemoveContainer" containerID="3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125" Apr 16 16:48:34.698283 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.698265 2577 scope.go:117] "RemoveContainer" containerID="4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a" Apr 16 16:48:34.704992 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.704975 2577 scope.go:117] "RemoveContainer" containerID="20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe" Apr 16 16:48:34.705251 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:48:34.705220 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe\": container with ID starting with 
20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe not found: ID does not exist" containerID="20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe" Apr 16 16:48:34.705295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.705262 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe"} err="failed to get container status \"20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe\": rpc error: code = NotFound desc = could not find container \"20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe\": container with ID starting with 20e19ccfecaca553a18ed54243ff34c3cd124c602f2f93e2dc25963855763fbe not found: ID does not exist" Apr 16 16:48:34.705295 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.705280 2577 scope.go:117] "RemoveContainer" containerID="3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125" Apr 16 16:48:34.705510 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:48:34.705491 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125\": container with ID starting with 3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125 not found: ID does not exist" containerID="3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125" Apr 16 16:48:34.705565 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.705518 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125"} err="failed to get container status \"3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125\": rpc error: code = NotFound desc = could not find container \"3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125\": container with ID starting with 
3ac0445c19641f4895cdf6ff3e38daed5c07997794b2e481e19f39c64dd1f125 not found: ID does not exist" Apr 16 16:48:34.705565 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.705533 2577 scope.go:117] "RemoveContainer" containerID="4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a" Apr 16 16:48:34.706222 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:48:34.705838 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a\": container with ID starting with 4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a not found: ID does not exist" containerID="4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a" Apr 16 16:48:34.706222 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.705868 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a"} err="failed to get container status \"4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a\": rpc error: code = NotFound desc = could not find container \"4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a\": container with ID starting with 4e90c10f5a0cfd6030f83f68e17ea0d394232b05a731bd48ecd38e0ab769397a not found: ID does not exist" Apr 16 16:48:34.707702 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.707678 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48"] Apr 16 16:48:34.712201 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:34.712179 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche68s48"] Apr 16 16:48:35.880412 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:35.880371 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" path="/var/lib/kubelet/pods/b9a8ddfe-6111-4cc5-a549-bf6275279ef1/volumes" Apr 16 16:48:37.613654 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613603 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"] Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613902 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="storage-initializer" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613915 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="storage-initializer" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613926 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="tokenizer" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613932 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="tokenizer" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613945 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="main" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.613951 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="main" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.614003 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="main" Apr 16 16:48:37.614033 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.614011 2577 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="b9a8ddfe-6111-4cc5-a549-bf6275279ef1" containerName="tokenizer" Apr 16 16:48:37.618774 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.618756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.621270 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.621244 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 16:48:37.621384 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.621300 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-ljnrl\"" Apr 16 16:48:37.621544 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.621531 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:48:37.622073 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.622059 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:48:37.628577 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.628554 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"] Apr 16 16:48:37.716022 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.715991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.716163 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.716027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.716163 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.716059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7988b283-ecf4-498a-aa17-867fa79eab4b-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.716163 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.716106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jb6m\" (UniqueName: \"kubernetes.io/projected/7988b283-ecf4-498a-aa17-867fa79eab4b-kube-api-access-5jb6m\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.817255 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.817215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") 
" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.817255 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.817256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7988b283-ecf4-498a-aa17-867fa79eab4b-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.817503 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.817386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jb6m\" (UniqueName: \"kubernetes.io/projected/7988b283-ecf4-498a-aa17-867fa79eab4b-kube-api-access-5jb6m\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.817561 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.817544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.817711 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.817683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.817940 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.817923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.819908 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.819889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7988b283-ecf4-498a-aa17-867fa79eab4b-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.826051 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.826030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jb6m\" (UniqueName: \"kubernetes.io/projected/7988b283-ecf4-498a-aa17-867fa79eab4b-kube-api-access-5jb6m\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:37.928617 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:37.928585 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:38.055189 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:38.055036 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"] Apr 16 16:48:38.057444 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:48:38.057414 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7988b283_ecf4_498a_aa17_867fa79eab4b.slice/crio-9af77bce79171b06ec0f7bc810553072dcdc3dc969cbe48c17044928c261f08d WatchSource:0}: Error finding container 9af77bce79171b06ec0f7bc810553072dcdc3dc969cbe48c17044928c261f08d: Status 404 returned error can't find the container with id 9af77bce79171b06ec0f7bc810553072dcdc3dc969cbe48c17044928c261f08d Apr 16 16:48:38.059758 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:38.059740 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:48:38.698199 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:38.698167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerStarted","Data":"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496"} Apr 16 16:48:38.698579 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:38.698205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerStarted","Data":"9af77bce79171b06ec0f7bc810553072dcdc3dc969cbe48c17044928c261f08d"} Apr 16 16:48:39.702326 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:39.702289 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerID="d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496" exitCode=0 Apr 16 16:48:39.702753 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:39.702345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerDied","Data":"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496"} Apr 16 16:48:40.707830 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:40.707788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerStarted","Data":"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"} Apr 16 16:48:40.707830 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:40.707831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerStarted","Data":"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"} Apr 16 16:48:40.708234 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:40.707850 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:40.733177 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:40.733133 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" podStartSLOduration=3.733118563 podStartE2EDuration="3.733118563s" podCreationTimestamp="2026-04-16 16:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 16:48:40.732118601 +0000 UTC m=+1581.463357053" watchObservedRunningTime="2026-04-16 16:48:40.733118563 +0000 UTC m=+1581.464357021" Apr 16 16:48:47.929365 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:47.929327 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:47.929365 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:47.929373 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:47.932263 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:47.932241 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:48:48.738088 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:48:48.738055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:49:09.742020 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:49:09.741941 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" Apr 16 16:51:23.777125 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.777092 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"] Apr 16 16:51:23.780312 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.780296 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.783775 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.783754 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 16:51:23.783884 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.783760 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\"" Apr 16 16:51:23.789125 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.789103 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"] Apr 16 16:51:23.826188 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.826151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.826357 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.826223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-dshm\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.826357 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.826312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-model-cache\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.826357 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.826347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9868d2-81ab-4337-8fd5-e584b6cece57-tls-certs\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.826545 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.826401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s4p5\" (UniqueName: \"kubernetes.io/projected/4e9868d2-81ab-4337-8fd5-e584b6cece57-kube-api-access-6s4p5\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.826545 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.826447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-home\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927507 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-dshm\") pod 
\"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-model-cache\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9868d2-81ab-4337-8fd5-e584b6cece57-tls-certs\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s4p5\" (UniqueName: \"kubernetes.io/projected/4e9868d2-81ab-4337-8fd5-e584b6cece57-kube-api-access-6s4p5\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-home\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927679 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.927980 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.927958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-model-cache\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.928045 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.928025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.928109 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.928065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-home\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.929951 ip-10-0-132-246 kubenswrapper[2577]: I0416 
16:51:23.929932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-dshm\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.930341 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.930321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9868d2-81ab-4337-8fd5-e584b6cece57-tls-certs\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:23.936810 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:23.936789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s4p5\" (UniqueName: \"kubernetes.io/projected/4e9868d2-81ab-4337-8fd5-e584b6cece57-kube-api-access-6s4p5\") pod \"scheduler-inline-config-test-kserve-58544ffb65-tdgdw\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" Apr 16 16:51:24.091166 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:24.091093 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:24.210724 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:24.210684 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"]
Apr 16 16:51:24.213811 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:51:24.213782 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e9868d2_81ab_4337_8fd5_e584b6cece57.slice/crio-7fa859e251f57332642ea6b6ae000c11a06554db5f4c06e64a5ed2ba735fae05 WatchSource:0}: Error finding container 7fa859e251f57332642ea6b6ae000c11a06554db5f4c06e64a5ed2ba735fae05: Status 404 returned error can't find the container with id 7fa859e251f57332642ea6b6ae000c11a06554db5f4c06e64a5ed2ba735fae05
Apr 16 16:51:25.222608 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:25.222562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" event={"ID":"4e9868d2-81ab-4337-8fd5-e584b6cece57","Type":"ContainerStarted","Data":"b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6"}
Apr 16 16:51:25.223122 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:25.222610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" event={"ID":"4e9868d2-81ab-4337-8fd5-e584b6cece57","Type":"ContainerStarted","Data":"7fa859e251f57332642ea6b6ae000c11a06554db5f4c06e64a5ed2ba735fae05"}
Apr 16 16:51:29.236700 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:29.236666 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerID="b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6" exitCode=0
Apr 16 16:51:29.237168 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:29.236741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" event={"ID":"4e9868d2-81ab-4337-8fd5-e584b6cece57","Type":"ContainerDied","Data":"b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6"}
Apr 16 16:51:32.248712 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:32.248677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" event={"ID":"4e9868d2-81ab-4337-8fd5-e584b6cece57","Type":"ContainerStarted","Data":"88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad"}
Apr 16 16:51:32.268043 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:32.267996 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" podStartSLOduration=6.627210019 podStartE2EDuration="9.267982769s" podCreationTimestamp="2026-04-16 16:51:23 +0000 UTC" firstStartedPulling="2026-04-16 16:51:29.237830255 +0000 UTC m=+1749.969068649" lastFinishedPulling="2026-04-16 16:51:31.878603004 +0000 UTC m=+1752.609841399" observedRunningTime="2026-04-16 16:51:32.267104811 +0000 UTC m=+1752.998343228" watchObservedRunningTime="2026-04-16 16:51:32.267982769 +0000 UTC m=+1752.999221185"
Apr 16 16:51:34.091211 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:34.091173 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:34.091211 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:34.091219 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:34.103630 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:34.103610 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:34.266986 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:34.266960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:56.651452 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:56.651415 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"]
Apr 16 16:51:56.651935 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:56.651804 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerName="main" containerID="cri-o://88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad" gracePeriod=30
Apr 16 16:51:56.908249 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:56.908191 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:57.004388 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004353 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-kserve-provision-location\") pod \"4e9868d2-81ab-4337-8fd5-e584b6cece57\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") "
Apr 16 16:51:57.004388 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004399 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-home\") pod \"4e9868d2-81ab-4337-8fd5-e584b6cece57\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") "
Apr 16 16:51:57.004626 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004461 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s4p5\" (UniqueName: \"kubernetes.io/projected/4e9868d2-81ab-4337-8fd5-e584b6cece57-kube-api-access-6s4p5\") pod \"4e9868d2-81ab-4337-8fd5-e584b6cece57\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") "
Apr 16 16:51:57.004626 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004512 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-dshm\") pod \"4e9868d2-81ab-4337-8fd5-e584b6cece57\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") "
Apr 16 16:51:57.004626 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004530 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9868d2-81ab-4337-8fd5-e584b6cece57-tls-certs\") pod \"4e9868d2-81ab-4337-8fd5-e584b6cece57\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") "
Apr 16 16:51:57.004626 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004549 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-model-cache\") pod \"4e9868d2-81ab-4337-8fd5-e584b6cece57\" (UID: \"4e9868d2-81ab-4337-8fd5-e584b6cece57\") "
Apr 16 16:51:57.004838 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004694 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-home" (OuterVolumeSpecName: "home") pod "4e9868d2-81ab-4337-8fd5-e584b6cece57" (UID: "4e9868d2-81ab-4337-8fd5-e584b6cece57"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:57.004975 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.004950 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-model-cache" (OuterVolumeSpecName: "model-cache") pod "4e9868d2-81ab-4337-8fd5-e584b6cece57" (UID: "4e9868d2-81ab-4337-8fd5-e584b6cece57"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:57.006781 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.006747 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9868d2-81ab-4337-8fd5-e584b6cece57-kube-api-access-6s4p5" (OuterVolumeSpecName: "kube-api-access-6s4p5") pod "4e9868d2-81ab-4337-8fd5-e584b6cece57" (UID: "4e9868d2-81ab-4337-8fd5-e584b6cece57"). InnerVolumeSpecName "kube-api-access-6s4p5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:51:57.006879 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.006819 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e9868d2-81ab-4337-8fd5-e584b6cece57-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4e9868d2-81ab-4337-8fd5-e584b6cece57" (UID: "4e9868d2-81ab-4337-8fd5-e584b6cece57"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:51:57.007291 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.007271 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-dshm" (OuterVolumeSpecName: "dshm") pod "4e9868d2-81ab-4337-8fd5-e584b6cece57" (UID: "4e9868d2-81ab-4337-8fd5-e584b6cece57"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:57.062948 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.062914 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e9868d2-81ab-4337-8fd5-e584b6cece57" (UID: "4e9868d2-81ab-4337-8fd5-e584b6cece57"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:57.106018 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.105993 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-dshm\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.106018 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.106015 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4e9868d2-81ab-4337-8fd5-e584b6cece57-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.106205 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.106025 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-model-cache\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.106205 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.106034 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.106205 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.106043 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4e9868d2-81ab-4337-8fd5-e584b6cece57-home\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.106205 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.106052 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6s4p5\" (UniqueName: \"kubernetes.io/projected/4e9868d2-81ab-4337-8fd5-e584b6cece57-kube-api-access-6s4p5\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.334706 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.334669 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerID="88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad" exitCode=0
Apr 16 16:51:57.334850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.334722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" event={"ID":"4e9868d2-81ab-4337-8fd5-e584b6cece57","Type":"ContainerDied","Data":"88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad"}
Apr 16 16:51:57.334850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.334761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw" event={"ID":"4e9868d2-81ab-4337-8fd5-e584b6cece57","Type":"ContainerDied","Data":"7fa859e251f57332642ea6b6ae000c11a06554db5f4c06e64a5ed2ba735fae05"}
Apr 16 16:51:57.334850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.334766 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"
Apr 16 16:51:57.334850 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.334775 2577 scope.go:117] "RemoveContainer" containerID="88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad"
Apr 16 16:51:57.343475 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.343455 2577 scope.go:117] "RemoveContainer" containerID="b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6"
Apr 16 16:51:57.363038 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.363019 2577 scope.go:117] "RemoveContainer" containerID="88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad"
Apr 16 16:51:57.363304 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:51:57.363282 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad\": container with ID starting with 88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad not found: ID does not exist" containerID="88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad"
Apr 16 16:51:57.363355 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.363318 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad"} err="failed to get container status \"88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad\": rpc error: code = NotFound desc = could not find container \"88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad\": container with ID starting with 88660b43bf6fe50993451efdb59cea60e2f97e085d12970656642c5d8ef791ad not found: ID does not exist"
Apr 16 16:51:57.363355 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.363338 2577 scope.go:117] "RemoveContainer" containerID="b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6"
Apr 16 16:51:57.363558 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:51:57.363538 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6\": container with ID starting with b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6 not found: ID does not exist" containerID="b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6"
Apr 16 16:51:57.363598 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.363564 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6"} err="failed to get container status \"b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6\": rpc error: code = NotFound desc = could not find container \"b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6\": container with ID starting with b80b3826831cf2a13e639a48c59dd35691cd13d09aa885e9172f9152612387d6 not found: ID does not exist"
Apr 16 16:51:57.363911 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.363883 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"]
Apr 16 16:51:57.366911 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.366892 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-58544ffb65-tdgdw"]
Apr 16 16:51:57.880897 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:51:57.880863 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" path="/var/lib/kubelet/pods/4e9868d2-81ab-4337-8fd5-e584b6cece57/volumes"
Apr 16 16:52:13.975584 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.975546 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"]
Apr 16 16:52:13.976031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.975850 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerName="storage-initializer"
Apr 16 16:52:13.976031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.975861 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerName="storage-initializer"
Apr 16 16:52:13.976031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.975878 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerName="main"
Apr 16 16:52:13.976031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.975883 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerName="main"
Apr 16 16:52:13.976031 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.975931 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e9868d2-81ab-4337-8fd5-e584b6cece57" containerName="main"
Apr 16 16:52:13.977874 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.977856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:13.984074 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.984049 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-q786g\""
Apr 16 16:52:13.984196 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.984092 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 16:52:13.999425 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:13.999402 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"]
Apr 16 16:52:14.033006 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.032974 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9qn\" (UniqueName: \"kubernetes.io/projected/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kube-api-access-vn9qn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.033198 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.033093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.033198 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.033172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.033305 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.033209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.134452 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.134418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.134452 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.134453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.134696 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.134490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9qn\" (UniqueName: \"kubernetes.io/projected/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kube-api-access-vn9qn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.134696 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.134533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.134848 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.134829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.134914 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.134889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.137234 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.137212 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.144484 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.144456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9qn\" (UniqueName: \"kubernetes.io/projected/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kube-api-access-vn9qn\") pod \"router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.287861 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.287797 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:14.422871 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:14.422782 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"]
Apr 16 16:52:14.425590 ip-10-0-132-246 kubenswrapper[2577]: W0416 16:52:14.425558 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a19bfc_7b4c_4864_a516_f3e9d39f3a95.slice/crio-88cf1fb617bef771dbf7cd5bafcc16d8f272d4f3e789753fe9a646077a2f65fd WatchSource:0}: Error finding container 88cf1fb617bef771dbf7cd5bafcc16d8f272d4f3e789753fe9a646077a2f65fd: Status 404 returned error can't find the container with id 88cf1fb617bef771dbf7cd5bafcc16d8f272d4f3e789753fe9a646077a2f65fd
Apr 16 16:52:15.327513 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.327478 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"]
Apr 16 16:52:15.328003 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.327804 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="main" containerID="cri-o://9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784" gracePeriod=30
Apr 16 16:52:15.328003 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.327841 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="tokenizer" containerID="cri-o://7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f" gracePeriod=30
Apr 16 16:52:15.397842 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.397805 2577 generic.go:358] "Generic (PLEG): container finished" podID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerID="fc345f13ecbf6ed25446ecd416bc3f9d6a6cb96e82fe3242f6ac6b2cfa4d8d15" exitCode=0
Apr 16 16:52:15.398039 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.397852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerDied","Data":"fc345f13ecbf6ed25446ecd416bc3f9d6a6cb96e82fe3242f6ac6b2cfa4d8d15"}
Apr 16 16:52:15.398039 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.397892 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerStarted","Data":"88cf1fb617bef771dbf7cd5bafcc16d8f272d4f3e789753fe9a646077a2f65fd"}
Apr 16 16:52:15.690763 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.690740 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"
Apr 16 16:52:15.751135 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.751099 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-kserve-provision-location\") pod \"7988b283-ecf4-498a-aa17-867fa79eab4b\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") "
Apr 16 16:52:15.751354 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.751176 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7988b283-ecf4-498a-aa17-867fa79eab4b-tls-certs\") pod \"7988b283-ecf4-498a-aa17-867fa79eab4b\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") "
Apr 16 16:52:15.751354 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.751224 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-tokenizer-uds\") pod \"7988b283-ecf4-498a-aa17-867fa79eab4b\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") "
Apr 16 16:52:15.751354 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.751265 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jb6m\" (UniqueName: \"kubernetes.io/projected/7988b283-ecf4-498a-aa17-867fa79eab4b-kube-api-access-5jb6m\") pod \"7988b283-ecf4-498a-aa17-867fa79eab4b\" (UID: \"7988b283-ecf4-498a-aa17-867fa79eab4b\") "
Apr 16 16:52:15.751681 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.751525 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7988b283-ecf4-498a-aa17-867fa79eab4b" (UID: "7988b283-ecf4-498a-aa17-867fa79eab4b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:15.752105 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.751968 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7988b283-ecf4-498a-aa17-867fa79eab4b" (UID: "7988b283-ecf4-498a-aa17-867fa79eab4b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:52:15.753526 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.753503 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7988b283-ecf4-498a-aa17-867fa79eab4b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7988b283-ecf4-498a-aa17-867fa79eab4b" (UID: "7988b283-ecf4-498a-aa17-867fa79eab4b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:52:15.753606 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.753590 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7988b283-ecf4-498a-aa17-867fa79eab4b-kube-api-access-5jb6m" (OuterVolumeSpecName: "kube-api-access-5jb6m") pod "7988b283-ecf4-498a-aa17-867fa79eab4b" (UID: "7988b283-ecf4-498a-aa17-867fa79eab4b"). InnerVolumeSpecName "kube-api-access-5jb6m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:52:15.852397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.852320 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:52:15.852397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.852346 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jb6m\" (UniqueName: \"kubernetes.io/projected/7988b283-ecf4-498a-aa17-867fa79eab4b-kube-api-access-5jb6m\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:52:15.852397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.852357 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7988b283-ecf4-498a-aa17-867fa79eab4b-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:52:15.852397 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:15.852367 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7988b283-ecf4-498a-aa17-867fa79eab4b-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\""
Apr 16 16:52:16.404005 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.403923 2577 generic.go:358] "Generic (PLEG): container finished" podID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerID="7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f" exitCode=0
Apr 16 16:52:16.404005 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.403949 2577 generic.go:358] "Generic (PLEG): container finished" podID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerID="9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784" exitCode=0
Apr 16 16:52:16.404005 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.403976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerDied","Data":"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"}
Apr 16 16:52:16.404498 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.404009 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"
Apr 16 16:52:16.404498 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.404025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerDied","Data":"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"}
Apr 16 16:52:16.404498 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.404042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd" event={"ID":"7988b283-ecf4-498a-aa17-867fa79eab4b","Type":"ContainerDied","Data":"9af77bce79171b06ec0f7bc810553072dcdc3dc969cbe48c17044928c261f08d"}
Apr 16 16:52:16.404498 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.404060 2577 scope.go:117] "RemoveContainer" containerID="7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"
Apr 16 16:52:16.407010 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.406985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerStarted","Data":"a983b84a0a2bdda18f6dcb349acb28d15816ab342e9d27fe334823163ba4c759"}
Apr 16 16:52:16.407123 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.407015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerStarted","Data":"0d07af2914e0784c52ef75d3e866181b5b9ea824f08c5ef44255f78e4c001c30"}
Apr 16 16:52:16.407189 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.407175 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"
Apr 16 16:52:16.412204 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.412185 2577 scope.go:117] "RemoveContainer" containerID="9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"
Apr 16 16:52:16.419265 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.419249 2577 scope.go:117] "RemoveContainer" containerID="d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496"
Apr 16 16:52:16.426023 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426007 2577 scope.go:117] "RemoveContainer" containerID="7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"
Apr 16 16:52:16.426283 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:52:16.426264 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f\": container with ID starting with 7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f not found: ID does not exist" containerID="7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"
Apr 16 16:52:16.426332 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426292 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"} err="failed to get container status \"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f\": rpc error: code = NotFound desc = could not find container \"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f\": container with ID starting with 7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f not found: ID does not exist"
Apr 16 16:52:16.426332 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426309 2577 scope.go:117] "RemoveContainer" containerID="9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"
Apr 16 16:52:16.426547 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:52:16.426531 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784\": container with ID starting with 9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784 not found: ID does not exist" containerID="9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"
Apr 16 16:52:16.426592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426552 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"} err="failed to get container status \"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784\": rpc error: code = NotFound desc = could not find container \"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784\": container with ID starting with 9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784 not found: ID does not exist"
Apr 16 16:52:16.426592 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426566 2577 scope.go:117] "RemoveContainer" containerID="d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496"
Apr 16 16:52:16.426812 ip-10-0-132-246 kubenswrapper[2577]: E0416 16:52:16.426792 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496\": container with ID starting with 
d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496 not found: ID does not exist" containerID="d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496" Apr 16 16:52:16.426876 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426822 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496"} err="failed to get container status \"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496\": rpc error: code = NotFound desc = could not find container \"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496\": container with ID starting with d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496 not found: ID does not exist" Apr 16 16:52:16.426876 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.426843 2577 scope.go:117] "RemoveContainer" containerID="7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f" Apr 16 16:52:16.427069 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.427042 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f"} err="failed to get container status \"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f\": rpc error: code = NotFound desc = could not find container \"7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f\": container with ID starting with 7c75d8569040e0068016a050d0f9caf16d79385f6474c50d183ce8476bcab44f not found: ID does not exist" Apr 16 16:52:16.427127 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.427072 2577 scope.go:117] "RemoveContainer" containerID="9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784" Apr 16 16:52:16.427276 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.427258 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784"} err="failed to get container status \"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784\": rpc error: code = NotFound desc = could not find container \"9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784\": container with ID starting with 9310ad44cc38dfa0c2afae291552d35e3b62c691016568818f6537310b982784 not found: ID does not exist" Apr 16 16:52:16.427326 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.427277 2577 scope.go:117] "RemoveContainer" containerID="d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496" Apr 16 16:52:16.427449 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.427433 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496"} err="failed to get container status \"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496\": rpc error: code = NotFound desc = could not find container \"d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496\": container with ID starting with d239cdea9534b378429d197d7c3b2c119ad4a5f4b197bc26c765daaf42f22496 not found: ID does not exist" Apr 16 16:52:16.439541 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.439496 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" podStartSLOduration=3.439482839 podStartE2EDuration="3.439482839s" podCreationTimestamp="2026-04-16 16:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:52:16.436229713 +0000 UTC m=+1797.167468167" watchObservedRunningTime="2026-04-16 16:52:16.439482839 +0000 UTC m=+1797.170721256" Apr 16 16:52:16.452315 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.452291 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"] Apr 16 16:52:16.454479 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:16.454447 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6869fqgjgd"] Apr 16 16:52:17.880605 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:17.880572 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" path="/var/lib/kubelet/pods/7988b283-ecf4-498a-aa17-867fa79eab4b/volumes" Apr 16 16:52:24.288105 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:24.288071 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:52:24.288105 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:24.288111 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:52:24.290899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:24.290871 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:52:24.434691 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:24.434665 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:52:45.438827 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:52:45.438792 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:54:56.680424 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.680386 2577 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"] Apr 16 16:54:56.680956 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.680745 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="main" containerID="cri-o://0d07af2914e0784c52ef75d3e866181b5b9ea824f08c5ef44255f78e4c001c30" gracePeriod=30 Apr 16 16:54:56.680956 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.680829 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="tokenizer" containerID="cri-o://a983b84a0a2bdda18f6dcb349acb28d15816ab342e9d27fe334823163ba4c759" gracePeriod=30 Apr 16 16:54:56.914261 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.914230 2577 generic.go:358] "Generic (PLEG): container finished" podID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerID="a983b84a0a2bdda18f6dcb349acb28d15816ab342e9d27fe334823163ba4c759" exitCode=0 Apr 16 16:54:56.914261 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.914256 2577 generic.go:358] "Generic (PLEG): container finished" podID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerID="0d07af2914e0784c52ef75d3e866181b5b9ea824f08c5ef44255f78e4c001c30" exitCode=0 Apr 16 16:54:56.914485 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.914284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerDied","Data":"a983b84a0a2bdda18f6dcb349acb28d15816ab342e9d27fe334823163ba4c759"} Apr 16 16:54:56.914485 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:56.914321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerDied","Data":"0d07af2914e0784c52ef75d3e866181b5b9ea824f08c5ef44255f78e4c001c30"} Apr 16 16:54:57.030698 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.030675 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:54:57.059427 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.059402 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn9qn\" (UniqueName: \"kubernetes.io/projected/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kube-api-access-vn9qn\") pod \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " Apr 16 16:54:57.059596 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.059437 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kserve-provision-location\") pod \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " Apr 16 16:54:57.059596 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.059462 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tokenizer-uds\") pod \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\" (UID: \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " Apr 16 16:54:57.059596 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.059555 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tls-certs\") pod \"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\" (UID: 
\"04a19bfc-7b4c-4864-a516-f3e9d39f3a95\") " Apr 16 16:54:57.059956 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.059929 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "04a19bfc-7b4c-4864-a516-f3e9d39f3a95" (UID: "04a19bfc-7b4c-4864-a516-f3e9d39f3a95"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:57.060392 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.060346 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04a19bfc-7b4c-4864-a516-f3e9d39f3a95" (UID: "04a19bfc-7b4c-4864-a516-f3e9d39f3a95"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:57.061882 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.061858 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "04a19bfc-7b4c-4864-a516-f3e9d39f3a95" (UID: "04a19bfc-7b4c-4864-a516-f3e9d39f3a95"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:54:57.062051 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.062032 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kube-api-access-vn9qn" (OuterVolumeSpecName: "kube-api-access-vn9qn") pod "04a19bfc-7b4c-4864-a516-f3e9d39f3a95" (UID: "04a19bfc-7b4c-4864-a516-f3e9d39f3a95"). InnerVolumeSpecName "kube-api-access-vn9qn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:54:57.161016 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.160987 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kserve-provision-location\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:54:57.161016 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.161011 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tokenizer-uds\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:54:57.161016 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.161022 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-tls-certs\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:54:57.161240 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.161030 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vn9qn\" (UniqueName: \"kubernetes.io/projected/04a19bfc-7b4c-4864-a516-f3e9d39f3a95-kube-api-access-vn9qn\") on node \"ip-10-0-132-246.ec2.internal\" DevicePath \"\"" Apr 16 16:54:57.918333 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.918299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" event={"ID":"04a19bfc-7b4c-4864-a516-f3e9d39f3a95","Type":"ContainerDied","Data":"88cf1fb617bef771dbf7cd5bafcc16d8f272d4f3e789753fe9a646077a2f65fd"} Apr 16 16:54:57.918721 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.918307 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj" Apr 16 16:54:57.918721 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.918343 2577 scope.go:117] "RemoveContainer" containerID="a983b84a0a2bdda18f6dcb349acb28d15816ab342e9d27fe334823163ba4c759" Apr 16 16:54:57.926175 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.926161 2577 scope.go:117] "RemoveContainer" containerID="0d07af2914e0784c52ef75d3e866181b5b9ea824f08c5ef44255f78e4c001c30" Apr 16 16:54:57.934902 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.934878 2577 scope.go:117] "RemoveContainer" containerID="fc345f13ecbf6ed25446ecd416bc3f9d6a6cb96e82fe3242f6ac6b2cfa4d8d15" Apr 16 16:54:57.939363 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.939338 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"] Apr 16 16:54:57.949743 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:57.949720 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-55f8fdb599zwbj"] Apr 16 16:54:59.879722 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:54:59.879683 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" path="/var/lib/kubelet/pods/04a19bfc-7b4c-4864-a516-f3e9d39f3a95/volumes" Apr 16 16:55:26.884040 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:26.883994 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-cq89s_b75fc954-79ff-4ddb-9c6f-23ec26c7fb31/discovery/0.log" Apr 16 16:55:27.724619 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:27.724588 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-cq89s_b75fc954-79ff-4ddb-9c6f-23ec26c7fb31/discovery/0.log" Apr 16 16:55:28.580176 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:55:28.580137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-s6qwr_41397218-8c5a-4010-a70e-f99a86f53581/authorino/0.log" Apr 16 16:55:28.673045 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:28.673019 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-vbzpd_2d463226-474e-4925-b6b6-0a6bff73827a/manager/0.log" Apr 16 16:55:28.701557 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:28.701523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-bkfpn_00a4139e-4585-46ab-835c-2d4eab31d934/manager/0.log" Apr 16 16:55:33.970069 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:33.970041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7d55g_9adfc688-8cd4-4e19-b964-829c6ec785ff/global-pull-secret-syncer/0.log" Apr 16 16:55:34.122023 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:34.121992 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lqxs7_c8bcc83b-1135-49eb-a662-f76883a04c53/konnectivity-agent/0.log" Apr 16 16:55:34.233146 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:34.233058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-246.ec2.internal_3d155529b212b181b4766962e45b3a8b/haproxy/0.log" Apr 16 16:55:37.914066 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:37.914039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-s6qwr_41397218-8c5a-4010-a70e-f99a86f53581/authorino/0.log" Apr 16 16:55:38.080674 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:38.080621 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-vbzpd_2d463226-474e-4925-b6b6-0a6bff73827a/manager/0.log" Apr 16 16:55:38.133952 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:38.133925 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-bkfpn_00a4139e-4585-46ab-835c-2d4eab31d934/manager/0.log" Apr 16 16:55:39.820941 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:39.820904 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vms6s_aae7248d-bc69-473b-8c2d-45d55385b6a5/node-exporter/0.log" Apr 16 16:55:39.842420 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:39.842369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vms6s_aae7248d-bc69-473b-8c2d-45d55385b6a5/kube-rbac-proxy/0.log" Apr 16 16:55:39.864899 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:39.864877 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vms6s_aae7248d-bc69-473b-8c2d-45d55385b6a5/init-textfile/0.log" Apr 16 16:55:42.659542 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659505 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"] Apr 16 16:55:42.659944 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659914 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="tokenizer" Apr 16 16:55:42.659944 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659934 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="tokenizer" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659948 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" 
containerName="storage-initializer" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659955 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="storage-initializer" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659963 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="storage-initializer" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659970 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="storage-initializer" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659978 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="main" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659984 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="main" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659989 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="main" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.659995 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="main" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.660014 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="tokenizer" Apr 16 16:55:42.660025 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.660020 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="tokenizer" 
Apr 16 16:55:42.660307 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.660070 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="tokenizer" Apr 16 16:55:42.660307 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.660083 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7988b283-ecf4-498a-aa17-867fa79eab4b" containerName="main" Apr 16 16:55:42.660307 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.660093 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="main" Apr 16 16:55:42.660307 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.660099 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="04a19bfc-7b4c-4864-a516-f3e9d39f3a95" containerName="tokenizer" Apr 16 16:55:42.662572 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.662552 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" Apr 16 16:55:42.665063 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.665041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfzkj\"/\"kube-root-ca.crt\"" Apr 16 16:55:42.666034 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.666020 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfzkj\"/\"openshift-service-ca.crt\"" Apr 16 16:55:42.666034 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.666028 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pfzkj\"/\"default-dockercfg-q75nm\"" Apr 16 16:55:42.671677 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.671636 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"] Apr 16 16:55:42.800246 ip-10-0-132-246 
kubenswrapper[2577]: I0416 16:55:42.800215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-sys\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" Apr 16 16:55:42.800246 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.800250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-lib-modules\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" Apr 16 16:55:42.800445 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.800288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-proc\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" Apr 16 16:55:42.800445 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.800364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-podres\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" Apr 16 16:55:42.800445 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.800405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgt4\" (UniqueName: 
\"kubernetes.io/projected/abb5872b-d130-4567-b79b-99ed351d8178-kube-api-access-ptgt4\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901032 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.900994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-sys\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-lib-modules\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-proc\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-sys\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-podres\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901184 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-proc\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901343 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgt4\" (UniqueName: \"kubernetes.io/projected/abb5872b-d130-4567-b79b-99ed351d8178-kube-api-access-ptgt4\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901343 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-lib-modules\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.901343 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.901271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abb5872b-d130-4567-b79b-99ed351d8178-podres\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.910601 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.910542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgt4\" (UniqueName: \"kubernetes.io/projected/abb5872b-d130-4567-b79b-99ed351d8178-kube-api-access-ptgt4\") pod \"perf-node-gather-daemonset-zg9lg\" (UID: \"abb5872b-d130-4567-b79b-99ed351d8178\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:42.973414 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:42.973387 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:43.116263 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:43.116131 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"]
Apr 16 16:55:43.121258 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:43.121235 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:55:44.057443 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.057403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" event={"ID":"abb5872b-d130-4567-b79b-99ed351d8178","Type":"ContainerStarted","Data":"d7c6539136c86a34f52eeef2284c0c3677ad663a1a88415245de1986607ec3cf"}
Apr 16 16:55:44.057957 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.057450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" event={"ID":"abb5872b-d130-4567-b79b-99ed351d8178","Type":"ContainerStarted","Data":"56350ce57edfb8cc09e0e87db61725ce56812ecac32d3405dbc568a9e816a060"}
Apr 16 16:55:44.057957 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.057544 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:44.075823 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.075773 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg" podStartSLOduration=2.075758017 podStartE2EDuration="2.075758017s" podCreationTimestamp="2026-04-16 16:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:55:44.074465003 +0000 UTC m=+2004.805703421" watchObservedRunningTime="2026-04-16 16:55:44.075758017 +0000 UTC m=+2004.806996433"
Apr 16 16:55:44.394271 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.394186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8nvl6_435f0085-97d9-46f8-973a-ffb39094715d/dns/0.log"
Apr 16 16:55:44.417890 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.417863 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8nvl6_435f0085-97d9-46f8-973a-ffb39094715d/kube-rbac-proxy/0.log"
Apr 16 16:55:44.612590 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:44.612560 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jchwl_9494261b-183d-4f87-ae51-80217757eafa/dns-node-resolver/0.log"
Apr 16 16:55:45.117918 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:45.117884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-628rf_ff2dc9df-f5a6-47e5-9597-5d45855573cd/node-ca/0.log"
Apr 16 16:55:46.078339 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:46.078307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-cq89s_b75fc954-79ff-4ddb-9c6f-23ec26c7fb31/discovery/0.log"
Apr 16 16:55:46.675038 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:46.675010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2cpdc_aab232ac-48c1-4811-b5d5-6ae9bf4d5040/serve-healthcheck-canary/0.log"
Apr 16 16:55:47.283198 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:47.283164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2j4nh_8cfac8c1-5283-430f-85c6-be8d1e0f94cc/kube-rbac-proxy/0.log"
Apr 16 16:55:47.312502 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:47.312474 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2j4nh_8cfac8c1-5283-430f-85c6-be8d1e0f94cc/exporter/0.log"
Apr 16 16:55:47.378871 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:47.378830 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2j4nh_8cfac8c1-5283-430f-85c6-be8d1e0f94cc/extractor/0.log"
Apr 16 16:55:50.070125 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:50.070098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-zg9lg"
Apr 16 16:55:51.122909 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:51.122876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-55c74f6fbc-xzj2m_77bd7dcc-9785-4f85-9332-86685486c2b2/manager/0.log"
Apr 16 16:55:51.220090 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:51.220065 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-ksqsj_dff582a4-3ff7-4498-aa36-170e46df357d/server/0.log"
Apr 16 16:55:51.553254 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:51.553227 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-pkj5c_da0a4793-5bf1-47fa-81c5-09bb9b89e2d9/seaweedfs/0.log"
Apr 16 16:55:58.641516 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.641483 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:55:58.669367 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.669343 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/egress-router-binary-copy/0.log"
Apr 16 16:55:58.711132 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.711109 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/cni-plugins/0.log"
Apr 16 16:55:58.761194 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.761171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/bond-cni-plugin/0.log"
Apr 16 16:55:58.818209 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.818186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/routeoverride-cni/0.log"
Apr 16 16:55:58.845231 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.845204 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/whereabouts-cni-bincopy/0.log"
Apr 16 16:55:58.868553 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:58.868531 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b7fht_566f5317-730d-4bea-9936-998ff669835f/whereabouts-cni/0.log"
Apr 16 16:55:59.155516 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:59.155492 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wmqwd_53a4fb09-6477-4a78-b6b9-b6dfa2c3499a/kube-multus/0.log"
Apr 16 16:55:59.181472 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:59.181441 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2wd9w_a648a078-0f71-4a2f-a255-ad1937929932/network-metrics-daemon/0.log"
Apr 16 16:55:59.208920 ip-10-0-132-246 kubenswrapper[2577]: I0416 16:55:59.208895 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2wd9w_a648a078-0f71-4a2f-a255-ad1937929932/kube-rbac-proxy/0.log"