Apr 22 17:31:39.224850 ip-10-0-133-169 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 17:31:39.224863 ip-10-0-133-169 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 17:31:39.224869 ip-10-0-133-169 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 17:31:39.225109 ip-10-0-133-169 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 17:31:49.430771 ip-10-0-133-169 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 17:31:49.430787 ip-10-0-133-169 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 9a81e109b2fb4bfe9aa58275bc0de9ba --
Apr 22 17:34:03.914297 ip-10-0-133-169 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:34:04.288074 ip-10-0-133-169 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:04.288074 ip-10-0-133-169 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:34:04.288074 ip-10-0-133-169 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:04.288074 ip-10-0-133-169 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:34:04.288074 ip-10-0-133-169 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:04.289530 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.289444 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:34:04.294011 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.293992 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:04.294011 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294010 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:04.294011 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294013 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294016 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294020 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294023 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294026 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294029 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294031 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294034 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294037 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294040 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294043 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294046 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294049 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294052 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294054 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294057 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294060 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294062 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294065 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294068 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:04.294107 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294071 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294073 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294076 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294079 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294081 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294084 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294086 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294089 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294091 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294095 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294097 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294100 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294105 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294109 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294112 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294115 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294118 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294123 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294127 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:04.294590 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294132 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294137 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294140 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294144 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294147 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294150 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294153 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294155 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294158 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294160 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294163 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294165 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294168 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294171 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294173 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294178 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294181 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294183 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294186 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294189 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:04.295074 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294191 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294194 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294197 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294199 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294202 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294205 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294208 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294211 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294213 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294216 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294218 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294221 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294223 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294226 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294229 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294232 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294235 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294238 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294241 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294243 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:04.295575 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294246 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:04.296073 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294248 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:04.296073 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294252 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:04.296073 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294254 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:04.296073 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.294257 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:04.296197 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296184 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:04.296197 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296196 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296201 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296204 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296207 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296210 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296213 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296215 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296218 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296221 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296236 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296240 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296244 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296247 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296250 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296253 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296255 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296259 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:04.296253 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296261 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296265 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296267 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296270 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296273 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296275 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296278 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296280 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296282 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296285 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296288 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296290 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296293 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296295 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296298 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296301 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296303 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296306 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296309 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296311 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:04.296666 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296313 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296316 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296319 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296321 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296323 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296326 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296328 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296332 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296336 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296339 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296341 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296344 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296346 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296348 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296352 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296355 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296360 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296364 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296368 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:04.297183 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296371 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296375 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296378 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296381 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296384 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296387 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296390 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296392 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296395 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296398 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296400 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296404 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296406 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296409 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296412 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296414 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296417 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296420 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296423 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296425 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:04.297643 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296428 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296431 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296434 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296436 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296439 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296441 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296443 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296447 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.296449 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296525 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296537 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296544 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296549 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296553 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296557 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296562 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296567 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296570 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296573 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296577 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296580 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296583 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:34:04.298153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296586 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296589 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296593 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296596 2568 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296599 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296602 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296607 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296610 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296613 2568 flags.go:64] FLAG: --config-dir=""
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296616 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296620 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296626 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296630 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296633 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296637 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296640 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296643 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296646 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296649 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296652 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296657 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296660 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296663 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296666 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296669 2568 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:34:04.298687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296672 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296678 2568 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296681 2568 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296684 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296687 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296690 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296694 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296697 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296700 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296703 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296706 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296709 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296712 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296715 2568 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296718 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296721 2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296724 2568 flags.go:64] FLAG: --feature-gates="" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296728 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296731 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296735 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296738 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296742 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296745 2568 flags.go:64] FLAG: --help="false" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296748 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-133-169.ec2.internal" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296751 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:34:04.299347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296754 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296757 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296760 2568 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296763 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296766 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296769 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296772 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296775 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296778 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296781 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296788 2568 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296791 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296795 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296798 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296802 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296805 2568 flags.go:64] FLAG: --lock-file="" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296807 2568 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296810 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296813 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296819 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296822 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296825 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296828 2568 flags.go:64] FLAG: --logging-format="text" Apr 22 17:34:04.299979 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296831 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296834 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296838 2568 flags.go:64] FLAG: --manifest-url="" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296841 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296846 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296849 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296853 2568 flags.go:64] FLAG: --max-pods="110" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296857 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: 
I0422 17:34:04.296860 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296863 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296866 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296869 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296872 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296875 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296883 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296886 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296889 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296892 2568 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296895 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296901 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296906 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296910 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 22 
17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296913 2568 flags.go:64] FLAG: --port="10250" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296916 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:34:04.300537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296919 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-003a89db427e37398" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296922 2568 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296925 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296928 2568 flags.go:64] FLAG: --register-node="true" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296931 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296948 2568 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296954 2568 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296958 2568 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296961 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296964 2568 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296968 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296971 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296974 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 
17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296977 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296982 2568 flags.go:64] FLAG: --runonce="false" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296985 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296988 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296991 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296994 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.296997 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297000 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297004 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297007 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297010 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297013 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297016 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:34:04.301176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297019 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:34:04.301809 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:34:04.297022 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297027 2568 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297030 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297035 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297038 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297041 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297060 2568 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297065 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297068 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297071 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297075 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297078 2568 flags.go:64] FLAG: --v="2" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297082 2568 flags.go:64] FLAG: --version="false" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297087 2568 flags.go:64] FLAG: --vmodule="" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297091 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 
17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.297094 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297203 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297207 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297210 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297214 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297218 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297221 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297224 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:04.301809 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297226 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297229 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297232 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297234 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297237 2568 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297240 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297242 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297245 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297247 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297250 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297253 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297255 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297258 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297260 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297263 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297266 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297268 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:04.302445 
ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297270 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297273 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:04.302445 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297275 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297278 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297280 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297283 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297285 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297288 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297290 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297293 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297295 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297299 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297301 2568 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297304 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297307 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297309 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297312 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297314 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297317 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297319 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297322 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297324 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:04.302923 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297327 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297329 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297332 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:04.303762 ip-10-0-133-169 
kubenswrapper[2568]: W0422 17:34:04.297336 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297340 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297343 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297346 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297348 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297351 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297354 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297356 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297359 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297361 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297364 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297366 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297369 2568 feature_gate.go:328] 
unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297372 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297374 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297377 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297379 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:04.303762 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297382 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297387 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297390 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297393 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297396 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297399 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297401 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297404 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297406 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297409 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297411 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297414 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297416 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297418 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297421 2568 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297424 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297426 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297429 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297431 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:04.304650 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.297434 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.298026 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.305073 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.305097 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305170 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305178 2568 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305184 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305189 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305193 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305198 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305202 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305206 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305211 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305215 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305219 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305223 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:04.305491 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305227 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305231 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:04.306209 ip-10-0-133-169 
kubenswrapper[2568]: W0422 17:34:04.305237 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305241 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305246 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305250 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305254 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305259 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305263 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305267 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305271 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305275 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305279 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305283 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305287 2568 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305291 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305296 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305300 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305304 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305310 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:04.306209 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305315 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305319 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305323 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305328 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305333 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305337 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305341 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:04.306783 ip-10-0-133-169 
kubenswrapper[2568]: W0422 17:34:04.305346 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305350 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305354 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305358 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305362 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305366 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305370 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305374 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305379 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305383 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305387 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305391 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:04.306783 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305395 2568 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305399 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305403 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305407 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305412 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305416 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305420 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305424 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305429 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305435 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305442 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305447 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305452 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305457 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305462 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305467 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305472 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305476 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305481 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305485 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:04.307508 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305488 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305493 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305497 2568 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305501 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305506 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305510 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305515 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305519 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305523 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305529 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305533 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305539 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305546 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305551 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305555 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:04.308027 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.305563 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305735 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305744 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305749 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305754 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305758 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305763 2568 feature_gate.go:328] unrecognized feature gate: 
AzureMultiDisk Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305767 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305771 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305777 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305781 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305786 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305791 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305795 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305799 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305803 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305808 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305812 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305816 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 
17:34:04.305821 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:04.308412 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305825 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305829 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305833 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305837 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305842 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305846 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305850 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305854 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305858 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305862 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305867 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305871 2568 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305876 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305881 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305885 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305890 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305894 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305898 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305902 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305906 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:04.308898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305911 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305916 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305920 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305924 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305929 2568 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305933 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305957 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305961 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305965 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.305993 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306000 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306005 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306009 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306014 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306019 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306023 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306027 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:04.309409 
ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306031 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306035 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306039 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:04.309409 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306043 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306048 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306052 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306056 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306060 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306064 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306069 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306073 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306077 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306081 2568 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306084 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306089 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306094 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306098 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306102 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306106 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306113 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306120 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306125 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306129 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:04.309910 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306134 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306138 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306145 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306150 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306155 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306159 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:04.306163 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.306171 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true 
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.306887 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.309445 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.310214 2568 server.go:1019] "Starting client certificate rotation"
Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.310313 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:04.310418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.310358 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:04.335638 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.335601 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:04.337874 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.337851 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:04.352550 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.352524 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:34:04.358983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.358965 2568 log.go:25] "Validated CRI v1 image API"
Apr 22 17:34:04.360289 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.360269 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:34:04.362774 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.362756 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:04.364491 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.364464 2568 fs.go:135] Filesystem UUIDs: map[02ecbef8-b4a9-4d40-bb1a-c62a32cb67c1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9c02d1a6-21be-459e-834c-5fe1e0e002e2:/dev/nvme0n1p4]
Apr 22 17:34:04.364567 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.364490 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:34:04.372104 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.371989 2568 manager.go:217] Machine: {Timestamp:2026-04-22 17:34:04.369872931 +0000 UTC m=+0.351518870 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3139771 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26c2cb03a572edb0a97c0ed5722364 SystemUUID:ec26c2cb-03a5-72ed-b0a9-7c0ed5722364 BootID:9a81e109-b2fb-4bfe-9aa5-8275bc0de9ba Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:36:db:fa:60:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:36:db:fa:60:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:de:cf:be:4a:57 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:34:04.372104 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372092 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:34:04.372240 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372177 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:34:04.372521 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372500 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:34:04.372659 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372522 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-169.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:34:04.372706 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372669 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:34:04.372706 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372678 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:34:04.372706 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.372691 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:04.373402 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.373392 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:04.374729 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.374720 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:04.374835 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.374826 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:34:04.377013 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377004 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:34:04.377054 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377017 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:34:04.377054 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377030 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:34:04.377054 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377040 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:34:04.377054 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377050 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:34:04.377978 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377964 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:04.378016 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.377991 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:04.381081 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.381066 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:34:04.382430 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.382414 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 17:34:04.384212 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384196 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 17:34:04.384285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384222 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 17:34:04.384285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384235 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 17:34:04.384285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384253 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 17:34:04.384285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384264 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 17:34:04.384285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384272 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 17:34:04.384285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384281 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 17:34:04.384499 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384289 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 17:34:04.384499 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384299 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 17:34:04.384499 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384308 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 17:34:04.384499 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384326 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 17:34:04.384499 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.384339 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 17:34:04.385098 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.385086 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 17:34:04.385157 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.385102 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 17:34:04.388730 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.388715 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 17:34:04.388822 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.388754 2568 server.go:1295] "Started kubelet"
Apr 22 17:34:04.388869 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.388833 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-169.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 17:34:04.389003 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.388961 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 17:34:04.389057 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.389023 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 17:34:04.389057 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.389042 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-169.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 17:34:04.389138 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.389087 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 17:34:04.389185 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.389160 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 17:34:04.389702 ip-10-0-133-169 systemd[1]: Started Kubernetes Kubelet.
Apr 22 17:34:04.390244 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.390105 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 17:34:04.392252 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.392237 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 17:34:04.395516 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.395496 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 17:34:04.396072 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.396056 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 17:34:04.396776 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.396760 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 17:34:04.396895 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.396764 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.397165 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.397141 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 17:34:04.397165 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.397167 2568 factory.go:55] Registering systemd factory
Apr 22 17:34:04.397297 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.397177 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 22 17:34:04.397386 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.397360 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 17:34:04.397452 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.397394 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 17:34:04.402058 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.397464 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-169.ec2.internal.18a8be422a8dfcb0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-169.ec2.internal,UID:ip-10-0-133-169.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-169.ec2.internal,},FirstTimestamp:2026-04-22 17:34:04.388727984 +0000 UTC m=+0.370373925,LastTimestamp:2026-04-22 17:34:04.388727984 +0000 UTC m=+0.370373925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-169.ec2.internal,}"
Apr 22 17:34:04.402164 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.402054 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 17:34:04.402164 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.402074 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 17:34:04.402286 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.400455 2568 factory.go:153] Registering CRI-O factory
Apr 22 17:34:04.402978 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.402293 2568 factory.go:223] Registration of the crio container factory successfully
Apr 22 17:34:04.402978 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.402851 2568 factory.go:103] Registering Raw factory
Apr 22 17:34:04.402978 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.402894 2568 manager.go:1196] Started watching for new ooms in manager
Apr 22 17:34:04.403147 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.403098 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 17:34:04.404070 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.404043 2568 manager.go:319] Starting recovery of all containers
Apr 22 17:34:04.404673 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.404641 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 17:34:04.404804 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.404778 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-169.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 17:34:04.409432 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.409408 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6fqf9"
Apr 22 17:34:04.415254 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.415136 2568 manager.go:324] Recovery completed
Apr 22 17:34:04.416775 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.416755 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6fqf9"
Apr 22 17:34:04.419713 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.419699 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:04.422174 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.422158 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:04.422236 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.422192 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:04.422236 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.422225 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:04.422746 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.422731 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 17:34:04.422746 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.422745 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 17:34:04.422869 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.422763 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:04.423892 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.423820 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-169.ec2.internal.18a8be422c8c5a1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-169.ec2.internal,UID:ip-10-0-133-169.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-169.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-169.ec2.internal,},FirstTimestamp:2026-04-22 17:34:04.422175262 +0000 UTC m=+0.403821201,LastTimestamp:2026-04-22 17:34:04.422175262 +0000 UTC m=+0.403821201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-169.ec2.internal,}"
Apr 22 17:34:04.426138 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.426122 2568 policy_none.go:49] "None policy: Start"
Apr 22 17:34:04.426200 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.426142 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 17:34:04.426200 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.426154 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.459472 2568 manager.go:341] "Starting Device Plugin manager"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.459505 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.459515 2568 server.go:85] "Starting device plugin registration server"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.459788 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.459800 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.459870 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.460060 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.460073 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.460612 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 17:34:04.484337 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.460653 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.526267 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.526222 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 17:34:04.527534 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.527511 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 17:34:04.527667 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.527543 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 17:34:04.527667 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.527567 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 17:34:04.527667 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.527576 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 17:34:04.527667 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.527616 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 17:34:04.531259 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.531230 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:04.560550 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.560482 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:04.561599 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.561583 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:04.561654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.561614 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:04.561654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.561629 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:04.561654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.561654 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.570279 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.570261 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.570335 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.570287 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-169.ec2.internal\": node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.583650 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.583625 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.628554 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.628522 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal"]
Apr 22 17:34:04.628649 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.628608 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:04.630412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.630396 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:04.630461 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.630426 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:04.630461 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.630436 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:04.632901 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.632887 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:04.633096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633081 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.633130 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633112 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:04.633729 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633710 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:04.633789 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633742 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:04.633789 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633754 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:04.633789 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633710 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:04.633883 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633794 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:04.633883 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.633804 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:04.636175 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.636158 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.636231 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.636194 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:34:04.637195 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.637178 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:34:04.637271 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.637205 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:34:04.637271 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.637223 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:34:04.653762 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.653732 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-169.ec2.internal\" not found" node="ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.657917 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.657897 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-169.ec2.internal\" not found" node="ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.684105 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.684074 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.704031 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.704005 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ab7e5d80042d81dd59b2f6770610fc29-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal\" (UID: \"ab7e5d80042d81dd59b2f6770610fc29\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.704119 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.704036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab7e5d80042d81dd59b2f6770610fc29-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal\" (UID: \"ab7e5d80042d81dd59b2f6770610fc29\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.704119 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.704059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dd997a444939df859d631949c5820f34-config\") pod \"kube-apiserver-proxy-ip-10-0-133-169.ec2.internal\" (UID: \"dd997a444939df859d631949c5820f34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.785054 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.785021 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.804419 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.804396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dd997a444939df859d631949c5820f34-config\") pod \"kube-apiserver-proxy-ip-10-0-133-169.ec2.internal\" (UID: \"dd997a444939df859d631949c5820f34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.804487 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.804424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ab7e5d80042d81dd59b2f6770610fc29-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal\" (UID: \"ab7e5d80042d81dd59b2f6770610fc29\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.804487 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.804450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab7e5d80042d81dd59b2f6770610fc29-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal\" (UID: \"ab7e5d80042d81dd59b2f6770610fc29\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.804552 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.804501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab7e5d80042d81dd59b2f6770610fc29-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal\" (UID: \"ab7e5d80042d81dd59b2f6770610fc29\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.804552 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.804516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dd997a444939df859d631949c5820f34-config\") pod \"kube-apiserver-proxy-ip-10-0-133-169.ec2.internal\" (UID: \"dd997a444939df859d631949c5820f34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.804552 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.804516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ab7e5d80042d81dd59b2f6770610fc29-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal\" (UID: \"ab7e5d80042d81dd59b2f6770610fc29\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.885902 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.885818 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:04.955337 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.955300 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.960913 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:04.960895 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal"
Apr 22 17:34:04.986611 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:04.986568 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:05.087401 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:05.087356 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:05.188029 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:05.187919 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:05.225745 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.225719 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:05.288657 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:05.288619 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found"
Apr 22 17:34:05.310267 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.310234 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:34:05.310416 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.310397 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:05.310464 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.310412 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160:
Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:34:05.388904 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:05.388867 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found" Apr 22 17:34:05.396017 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.395994 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 17:34:05.409355 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.409326 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:34:05.419878 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.419840 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:29:04 +0000 UTC" deadline="2027-12-18 16:59:09.78320199 +0000 UTC" Apr 22 17:34:05.419878 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.419877 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14519h25m4.363329959s" Apr 22 17:34:05.433505 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.433474 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tctsh" Apr 22 17:34:05.439184 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.439127 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tctsh" Apr 22 17:34:05.489597 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:05.489551 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-169.ec2.internal\" not found" Apr 22 17:34:05.499370 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.499328 2568 
reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:05.596919 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.596896 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal" Apr 22 17:34:05.600755 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:05.600728 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab7e5d80042d81dd59b2f6770610fc29.slice/crio-7c73829020ed6472580da25d3ae271991d5a56f437539685c8b783f29b9a33d8 WatchSource:0}: Error finding container 7c73829020ed6472580da25d3ae271991d5a56f437539685c8b783f29b9a33d8: Status 404 returned error can't find the container with id 7c73829020ed6472580da25d3ae271991d5a56f437539685c8b783f29b9a33d8 Apr 22 17:34:05.600964 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:05.600926 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd997a444939df859d631949c5820f34.slice/crio-baa986bc4a3c6847deffc4e8c2bbd68bd12a49f0c161340ea94da4f21cfaa2c7 WatchSource:0}: Error finding container baa986bc4a3c6847deffc4e8c2bbd68bd12a49f0c161340ea94da4f21cfaa2c7: Status 404 returned error can't find the container with id baa986bc4a3c6847deffc4e8c2bbd68bd12a49f0c161340ea94da4f21cfaa2c7 Apr 22 17:34:05.607067 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.607044 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:34:05.613509 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.613492 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:34:05.615258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.615243 2568 kubelet.go:3340] 
"Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal" Apr 22 17:34:05.623083 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.623067 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:34:05.697870 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:05.697713 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:06.378389 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.378351 2568 apiserver.go:52] "Watching apiserver" Apr 22 17:34:06.387696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.387050 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 17:34:06.387696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.387582 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jckwf","openshift-network-operator/iptables-alerter-w7fhk","kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal","openshift-multus/multus-additional-cni-plugins-9nhgg","openshift-multus/network-metrics-daemon-cb4rf","openshift-network-diagnostics/network-check-target-6wq9d","openshift-ovn-kubernetes/ovnkube-node-vckbs","kube-system/konnectivity-agent-85kw2","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2","openshift-cluster-node-tuning-operator/tuned-hspnj","openshift-dns/node-resolver-snnsr","openshift-image-registry/node-ca-c6mcn"] Apr 22 17:34:06.390846 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.390818 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.393073 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.393049 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w7fhk" Apr 22 17:34:06.394353 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.394332 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.394474 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.394457 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.394569 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.394553 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-59wpn\"" Apr 22 17:34:06.395888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.395273 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.395888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.395498 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.395888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.395635 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:34:06.395888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.395765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xrv6x\"" Apr 22 17:34:06.395888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.395785 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.395888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.395865 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:34:06.397770 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.397633 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:34:06.398224 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.398105 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.398224 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.398125 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:06.398224 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.398195 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:06.398474 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.398294 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:34:06.398625 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.398607 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8blbk\"" Apr 22 17:34:06.398687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.398629 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:34:06.398687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.398649 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.400436 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.400419 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:06.400515 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.400494 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:06.403083 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.403061 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.405830 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.405809 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.406244 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.406066 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t8jx7\"" Apr 22 17:34:06.406244 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.406117 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:34:06.408376 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.408354 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.408471 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.408420 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:34:06.408530 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.408472 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:34:06.408841 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.408821 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:06.409046 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.409028 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:34:06.409112 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.409067 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.411832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.411443 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-42f6v\"" Apr 22 17:34:06.411832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.411509 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:34:06.411832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.411647 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x6759\"" Apr 22 17:34:06.411832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.411690 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:34:06.411832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.411791 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:34:06.412156 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.412061 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.416105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416080 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.416679 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416656 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-system-cni-dir\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.416777 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416701 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-os-release\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.416777 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.416777 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416769 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.416872 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-device-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.416872 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c0bd56c9-8f3b-4147-87d8-99194e94569f-iptables-alerter-script\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk" Apr 22 17:34:06.416928 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416874 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-socket-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.416986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.416923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-etc-selinux\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.417023 ip-10-0-133-169 kubenswrapper[2568]: I0422 
17:34:06.416981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0bd56c9-8f3b-4147-87d8-99194e94569f-host-slash\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk" Apr 22 17:34:06.417058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417021 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cni-binary-copy\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.417058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417052 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:06.417118 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417091 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.417150 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cnibin\") pod 
\"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.417183 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.417215 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417182 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vslrp\" (UniqueName: \"kubernetes.io/projected/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-kube-api-access-vslrp\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.417249 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmll\" (UniqueName: \"kubernetes.io/projected/71cf0a1f-9d8d-4195-9355-be900422df45-kube-api-access-hmmll\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:06.417285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417257 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-registration-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.417362 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-sys-fs\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.417423 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7gp\" (UniqueName: \"kubernetes.io/projected/4369eacb-a4c5-43c1-8201-a941859e9c99-kube-api-access-sc7gp\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.417472 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz9cr\" (UniqueName: \"kubernetes.io/projected/c0bd56c9-8f3b-4147-87d8-99194e94569f-kube-api-access-mz9cr\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk" Apr 22 17:34:06.417472 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417444 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.417870 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.417853 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.418132 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.418057 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnsbx\"" Apr 22 17:34:06.418546 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.418309 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.421601 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.420778 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.421601 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.421122 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:34:06.421601 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.421142 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6fs2d\"" Apr 22 17:34:06.421601 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.421471 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:34:06.421601 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.421596 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.422316 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.422272 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:34:06.422558 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.422537 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sfkpl\""
Apr 22 17:34:06.440153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.440040 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:05 +0000 UTC" deadline="2027-11-22 20:47:28.722696644 +0000 UTC"
Apr 22 17:34:06.440153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.440091 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13899h13m22.282609632s"
Apr 22 17:34:06.498891 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.498864 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:34:06.517733 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssbh\" (UniqueName: \"kubernetes.io/projected/413cd8ee-53a3-4bf3-9251-11b54262bb83-kube-api-access-kssbh\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn"
Apr 22 17:34:06.517733 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517735 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cnibin\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517761 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-slash\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz269\" (UniqueName: \"kubernetes.io/projected/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-kube-api-access-dz269\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-cni-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cnibin\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-cnibin\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-socket-dir-parent\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.517988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.517988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-cni-multus\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-systemd\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518079 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovnkube-script-lib\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518105 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-hosts-file\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518126 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-conf-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysconfig\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518167 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/413cd8ee-53a3-4bf3-9251-11b54262bb83-host\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn"
Apr 22 17:34:06.518285 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-etc-selinux\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518288 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-env-overrides\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxzbq\" (UniqueName: \"kubernetes.io/projected/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-kube-api-access-lxzbq\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518332 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-etc-selinux\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518359 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edd47fac-d678-4368-a928-a3ac85b7a40a-cni-binary-copy\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-k8s-cni-cncf-io\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmll\" (UniqueName: \"kubernetes.io/projected/71cf0a1f-9d8d-4195-9355-be900422df45-kube-api-access-hmmll\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-systemd-units\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518481 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-node-log\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-cni-bin\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-netns\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518571 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-kubelet\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518585 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-run-netns\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-ovn\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518622 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-systemd\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518673 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-var-lib-kubelet\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.518696 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-os-release\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-daemon-config\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c0bd56c9-8f3b-4147-87d8-99194e94569f-iptables-alerter-script\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysctl-d\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-tuned\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-tmp-dir\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-socket-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518918 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cni-binary-copy\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.518982 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-log-socket\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519004 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-cni-netd\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519027 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovnkube-config\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-modprobe-d\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-socket-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-sys\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.519117 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:06.519421 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-var-lib-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.519207 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:07.019168566 +0000 UTC m=+3.000814506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519236 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-host\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519262 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f47ae572-69e3-4737-9211-cfdba8868f24-agent-certs\") pod \"konnectivity-agent-85kw2\" (UID: \"f47ae572-69e3-4737-9211-cfdba8868f24\") " pod="kube-system/konnectivity-agent-85kw2"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f47ae572-69e3-4737-9211-cfdba8868f24-konnectivity-ca\") pod \"konnectivity-agent-85kw2\" (UID: \"f47ae572-69e3-4737-9211-cfdba8868f24\") " pod="kube-system/konnectivity-agent-85kw2"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c0bd56c9-8f3b-4147-87d8-99194e94569f-iptables-alerter-script\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-kubelet\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-registration-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519366 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-sys-fs\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-registration-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-sys-fs\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519404 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519431 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-kubernetes\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519457 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/413cd8ee-53a3-4bf3-9251-11b54262bb83-serviceca\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519429 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cni-binary-copy\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.520105 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519488 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-multus-certs\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519514 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-etc-kubernetes\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28zpd\" (UniqueName: \"kubernetes.io/projected/edd47fac-d678-4368-a928-a3ac85b7a40a-kube-api-access-28zpd\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-device-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-etc-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-lib-modules\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519647 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69032494-29d8-4c24-b5c0-06e8e6bb9787-tmp\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519710 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vslrp\" (UniqueName: \"kubernetes.io/projected/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-kube-api-access-vslrp\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519719 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-device-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4369eacb-a4c5-43c1-8201-a941859e9c99-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovn-node-metrics-cert\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519811 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-cni-bin\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7gp\" (UniqueName: \"kubernetes.io/projected/4369eacb-a4c5-43c1-8201-a941859e9c99-kube-api-access-sc7gp\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz9cr\" (UniqueName: \"kubernetes.io/projected/c0bd56c9-8f3b-4147-87d8-99194e94569f-kube-api-access-mz9cr\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.519981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-system-cni-dir\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.520678 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-os-release\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520114 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-run\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520141 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-os-release\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520184 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-system-cni-dir\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysctl-conf\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520256 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvm9z\" (UniqueName: \"kubernetes.io/projected/69032494-29d8-4c24-b5c0-06e8e6bb9787-kube-api-access-fvm9z\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520284 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-system-cni-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-hostroot\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520347 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0bd56c9-8f3b-4147-87d8-99194e94569f-host-slash\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0bd56c9-8f3b-4147-87d8-99194e94569f-host-slash\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk"
Apr 22 17:34:06.521190 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.520591 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg"
Apr 22 17:34:06.526253 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.526227
2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:34:06.529929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.529858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmll\" (UniqueName: \"kubernetes.io/projected/71cf0a1f-9d8d-4195-9355-be900422df45-kube-api-access-hmmll\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:06.529929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.529881 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vslrp\" (UniqueName: \"kubernetes.io/projected/51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81-kube-api-access-vslrp\") pod \"multus-additional-cni-plugins-9nhgg\" (UID: \"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81\") " pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.530371 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.530335 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz9cr\" (UniqueName: \"kubernetes.io/projected/c0bd56c9-8f3b-4147-87d8-99194e94569f-kube-api-access-mz9cr\") pod \"iptables-alerter-w7fhk\" (UID: \"c0bd56c9-8f3b-4147-87d8-99194e94569f\") " pod="openshift-network-operator/iptables-alerter-w7fhk" Apr 22 17:34:06.530579 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.530539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7gp\" (UniqueName: \"kubernetes.io/projected/4369eacb-a4c5-43c1-8201-a941859e9c99-kube-api-access-sc7gp\") pod \"aws-ebs-csi-driver-node-lhsn2\" (UID: \"4369eacb-a4c5-43c1-8201-a941859e9c99\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.532718 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.532245 
2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal" event={"ID":"dd997a444939df859d631949c5820f34","Type":"ContainerStarted","Data":"baa986bc4a3c6847deffc4e8c2bbd68bd12a49f0c161340ea94da4f21cfaa2c7"} Apr 22 17:34:06.535884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.535841 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal" event={"ID":"ab7e5d80042d81dd59b2f6770610fc29","Type":"ContainerStarted","Data":"7c73829020ed6472580da25d3ae271991d5a56f437539685c8b783f29b9a33d8"} Apr 22 17:34:06.621630 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvm9z\" (UniqueName: \"kubernetes.io/projected/69032494-29d8-4c24-b5c0-06e8e6bb9787-kube-api-access-fvm9z\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.621630 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621636 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-system-cni-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-hostroot\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kssbh\" (UniqueName: \"kubernetes.io/projected/413cd8ee-53a3-4bf3-9251-11b54262bb83-kube-api-access-kssbh\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-slash\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz269\" (UniqueName: \"kubernetes.io/projected/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-kube-api-access-dz269\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-system-cni-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-slash\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-hostroot\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-cni-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-cni-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-cnibin\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621850 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-socket-dir-parent\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621853 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-cnibin\") pod \"multus-jckwf\" (UID: 
\"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.621873 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-cni-multus\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-socket-dir-parent\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-systemd\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-systemd\") pod 
\"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovnkube-script-lib\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621973 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-cni-multus\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.621994 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-hosts-file\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-conf-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622050 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-conf-dir\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysconfig\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/413cd8ee-53a3-4bf3-9251-11b54262bb83-host\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622102 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-hosts-file\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysconfig\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-env-overrides\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622150 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/413cd8ee-53a3-4bf3-9251-11b54262bb83-host\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622155 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxzbq\" (UniqueName: \"kubernetes.io/projected/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-kube-api-access-lxzbq\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.622453 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edd47fac-d678-4368-a928-a3ac85b7a40a-cni-binary-copy\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-k8s-cni-cncf-io\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-systemd-units\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-node-log\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-node-log\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-systemd-units\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-k8s-cni-cncf-io\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-cni-bin\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-netns\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-kubelet\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-run-netns\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-cni-bin\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622428 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-netns\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-kubelet\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622499 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-ovn\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622518 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-systemd\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622525 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-run-netns\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622536 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-var-lib-kubelet\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.623258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622545 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-ovn\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-os-release\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-var-lib-kubelet\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-daemon-config\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-os-release\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-systemd\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622623 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysctl-d\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-tuned\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622685 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-tmp-dir\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-log-socket\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-cni-netd\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622774 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysctl-d\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovnkube-config\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-modprobe-d\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-log-socket\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-sys\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-var-lib-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-host\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/f47ae572-69e3-4737-9211-cfdba8868f24-agent-certs\") pod \"konnectivity-agent-85kw2\" (UID: \"f47ae572-69e3-4737-9211-cfdba8868f24\") " pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f47ae572-69e3-4737-9211-cfdba8868f24-konnectivity-ca\") pod \"konnectivity-agent-85kw2\" (UID: \"f47ae572-69e3-4737-9211-cfdba8868f24\") " pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622973 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-kubelet\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.622999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-kubernetes\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623071 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/413cd8ee-53a3-4bf3-9251-11b54262bb83-serviceca\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-env-overrides\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-multus-certs\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-etc-kubernetes\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edd47fac-d678-4368-a928-a3ac85b7a40a-multus-daemon-config\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623140 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-cni-netd\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623151 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28zpd\" (UniqueName: \"kubernetes.io/projected/edd47fac-d678-4368-a928-a3ac85b7a40a-kube-api-access-28zpd\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623180 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-etc-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovnkube-script-lib\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-lib-modules\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-tmp-dir\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69032494-29d8-4c24-b5c0-06e8e6bb9787-tmp\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovn-node-metrics-cert\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.624884 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-cni-bin\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623193 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-sys\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-etc-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-run\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edd47fac-d678-4368-a928-a3ac85b7a40a-cni-binary-copy\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-modprobe-d\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysctl-conf\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-lib-modules\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623476 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-var-lib-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-host\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-run-multus-certs\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-etc-kubernetes\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623534 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-cni-bin\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edd47fac-d678-4368-a928-a3ac85b7a40a-host-var-lib-kubelet\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovnkube-config\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.625728 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-run-openvswitch\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/413cd8ee-53a3-4bf3-9251-11b54262bb83-serviceca\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-kubernetes\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-sysctl-conf\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.623821 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/69032494-29d8-4c24-b5c0-06e8e6bb9787-run\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.624132 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f47ae572-69e3-4737-9211-cfdba8868f24-konnectivity-ca\") pod \"konnectivity-agent-85kw2\" (UID: \"f47ae572-69e3-4737-9211-cfdba8868f24\") " pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.626232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f47ae572-69e3-4737-9211-cfdba8868f24-agent-certs\") pod \"konnectivity-agent-85kw2\" (UID: \"f47ae572-69e3-4737-9211-cfdba8868f24\") " pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.626490 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69032494-29d8-4c24-b5c0-06e8e6bb9787-etc-tuned\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.626612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.626529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-ovn-node-metrics-cert\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.626891 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.626797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/69032494-29d8-4c24-b5c0-06e8e6bb9787-tmp\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.628713 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.628656 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:06.628713 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.628683 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:06.628713 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.628696 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5m6v9 for pod openshift-network-diagnostics/network-check-target-6wq9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:06.628920 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:06.628786 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9 podName:bca926f3-b1c0-45e1-9eb2-e0d0fc51b178 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:07.128767563 +0000 UTC m=+3.110413490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5m6v9" (UniqueName: "kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9") pod "network-check-target-6wq9d" (UID: "bca926f3-b1c0-45e1-9eb2-e0d0fc51b178") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:06.630173 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.630147 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxzbq\" (UniqueName: \"kubernetes.io/projected/4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc-kube-api-access-lxzbq\") pod \"node-resolver-snnsr\" (UID: \"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc\") " pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.630757 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.630711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssbh\" (UniqueName: \"kubernetes.io/projected/413cd8ee-53a3-4bf3-9251-11b54262bb83-kube-api-access-kssbh\") pod \"node-ca-c6mcn\" (UID: \"413cd8ee-53a3-4bf3-9251-11b54262bb83\") " pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:06.631016 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.630995 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvm9z\" (UniqueName: \"kubernetes.io/projected/69032494-29d8-4c24-b5c0-06e8e6bb9787-kube-api-access-fvm9z\") pod \"tuned-hspnj\" (UID: \"69032494-29d8-4c24-b5c0-06e8e6bb9787\") " pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.631415 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.631398 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz269\" (UniqueName: \"kubernetes.io/projected/1ffbd3d2-a676-442d-8a27-08dac0cc37fe-kube-api-access-dz269\") pod \"ovnkube-node-vckbs\" (UID: \"1ffbd3d2-a676-442d-8a27-08dac0cc37fe\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.631547 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.631527 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28zpd\" (UniqueName: \"kubernetes.io/projected/edd47fac-d678-4368-a928-a3ac85b7a40a-kube-api-access-28zpd\") pod \"multus-jckwf\" (UID: \"edd47fac-d678-4368-a928-a3ac85b7a40a\") " pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.691419 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.691387 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:06.706720 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.706657 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" Apr 22 17:34:06.716516 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.716481 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w7fhk" Apr 22 17:34:06.728494 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.728454 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" Apr 22 17:34:06.736253 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.736223 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-snnsr" Apr 22 17:34:06.752986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.752952 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:06.758719 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.758699 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:06.766362 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.766337 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jckwf" Apr 22 17:34:06.773033 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.773007 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hspnj" Apr 22 17:34:06.776623 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:06.776604 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-c6mcn" Apr 22 17:34:07.026674 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.026603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:07.026804 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:07.026757 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:07.026856 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:07.026835 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:08.026818714 +0000 UTC m=+4.008464641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:07.227828 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.227782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:07.228024 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:07.227933 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:07.228024 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:07.227966 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:07.228024 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:07.227984 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5m6v9 for pod openshift-network-diagnostics/network-check-target-6wq9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:07.228147 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:07.228054 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9 podName:bca926f3-b1c0-45e1-9eb2-e0d0fc51b178 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:08.228034472 +0000 UTC m=+4.209680421 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m6v9" (UniqueName: "kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9") pod "network-check-target-6wq9d" (UID: "bca926f3-b1c0-45e1-9eb2-e0d0fc51b178") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:07.280651 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.280619 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf47ae572_69e3_4737_9211_cfdba8868f24.slice/crio-8c9d839f737e5a03517e4a040a7c539020f110dfe3e855a85bc13623792dd108 WatchSource:0}: Error finding container 8c9d839f737e5a03517e4a040a7c539020f110dfe3e855a85bc13623792dd108: Status 404 returned error can't find the container with id 8c9d839f737e5a03517e4a040a7c539020f110dfe3e855a85bc13623792dd108 Apr 22 17:34:07.282595 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.282558 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0bd56c9_8f3b_4147_87d8_99194e94569f.slice/crio-eadf09b4988810083a039f688165835c92e16f7ddc9d6b721980667b380b779d WatchSource:0}: Error finding container eadf09b4988810083a039f688165835c92e16f7ddc9d6b721980667b380b779d: Status 404 returned error can't find the container with id eadf09b4988810083a039f688165835c92e16f7ddc9d6b721980667b380b779d Apr 22 17:34:07.286340 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.286317 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4369eacb_a4c5_43c1_8201_a941859e9c99.slice/crio-7478a10a700224404fe8b84a23591783a0e1d7048fde94c75a642481f39a55b5 WatchSource:0}: Error finding container 
7478a10a700224404fe8b84a23591783a0e1d7048fde94c75a642481f39a55b5: Status 404 returned error can't find the container with id 7478a10a700224404fe8b84a23591783a0e1d7048fde94c75a642481f39a55b5 Apr 22 17:34:07.306930 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.306898 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ffbd3d2_a676_442d_8a27_08dac0cc37fe.slice/crio-2e50e08eb709a90d7e8a5507438bd7189f81f0d6030323a94f2826dddd6cc68a WatchSource:0}: Error finding container 2e50e08eb709a90d7e8a5507438bd7189f81f0d6030323a94f2826dddd6cc68a: Status 404 returned error can't find the container with id 2e50e08eb709a90d7e8a5507438bd7189f81f0d6030323a94f2826dddd6cc68a Apr 22 17:34:07.307576 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.307549 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69032494_29d8_4c24_b5c0_06e8e6bb9787.slice/crio-00a0c04a07506a058c0ec9b3c0aaf5a2e9bc0d1efeb8b72faaaafd669b5909c9 WatchSource:0}: Error finding container 00a0c04a07506a058c0ec9b3c0aaf5a2e9bc0d1efeb8b72faaaafd669b5909c9: Status 404 returned error can't find the container with id 00a0c04a07506a058c0ec9b3c0aaf5a2e9bc0d1efeb8b72faaaafd669b5909c9 Apr 22 17:34:07.308628 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.308595 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c36ca16_a7c9_4be1_a04f_3fe4cbe924fc.slice/crio-b8ce2c10ebc4708067db4f0242f55dd4271ca875bac661edbe4f93813342b32b WatchSource:0}: Error finding container b8ce2c10ebc4708067db4f0242f55dd4271ca875bac661edbe4f93813342b32b: Status 404 returned error can't find the container with id b8ce2c10ebc4708067db4f0242f55dd4271ca875bac661edbe4f93813342b32b Apr 22 17:34:07.309639 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.309585 2568 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c5cdbf_e98b_4e0a_bdb5_c1cdbe7f1a81.slice/crio-3cdddd7e2e8da60dfeea89381d93f362ad7264576e1e93f1b16e40d0da50bc9d WatchSource:0}: Error finding container 3cdddd7e2e8da60dfeea89381d93f362ad7264576e1e93f1b16e40d0da50bc9d: Status 404 returned error can't find the container with id 3cdddd7e2e8da60dfeea89381d93f362ad7264576e1e93f1b16e40d0da50bc9d Apr 22 17:34:07.310239 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.310219 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod413cd8ee_53a3_4bf3_9251_11b54262bb83.slice/crio-78cce64b58e0f6b312e40f1fdca922dd431fced3d4b73073d0bf670e2b478b8f WatchSource:0}: Error finding container 78cce64b58e0f6b312e40f1fdca922dd431fced3d4b73073d0bf670e2b478b8f: Status 404 returned error can't find the container with id 78cce64b58e0f6b312e40f1fdca922dd431fced3d4b73073d0bf670e2b478b8f Apr 22 17:34:07.311100 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:07.311074 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd47fac_d678_4368_a928_a3ac85b7a40a.slice/crio-5b6ca8eb72f4c0ab32498e1c739922f2a3818f8c316f24b5e338105c347edf5b WatchSource:0}: Error finding container 5b6ca8eb72f4c0ab32498e1c739922f2a3818f8c316f24b5e338105c347edf5b: Status 404 returned error can't find the container with id 5b6ca8eb72f4c0ab32498e1c739922f2a3818f8c316f24b5e338105c347edf5b Apr 22 17:34:07.441329 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.441131 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:05 +0000 UTC" deadline="2027-11-29 15:58:33.718759012 +0000 UTC" Apr 22 17:34:07.441329 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.441318 2568 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14062h24m26.277444655s" Apr 22 17:34:07.538892 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.538771 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-snnsr" event={"ID":"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc","Type":"ContainerStarted","Data":"b8ce2c10ebc4708067db4f0242f55dd4271ca875bac661edbe4f93813342b32b"} Apr 22 17:34:07.539709 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.539683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerStarted","Data":"3cdddd7e2e8da60dfeea89381d93f362ad7264576e1e93f1b16e40d0da50bc9d"} Apr 22 17:34:07.540664 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.540643 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hspnj" event={"ID":"69032494-29d8-4c24-b5c0-06e8e6bb9787","Type":"ContainerStarted","Data":"00a0c04a07506a058c0ec9b3c0aaf5a2e9bc0d1efeb8b72faaaafd669b5909c9"} Apr 22 17:34:07.542214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.542195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"2e50e08eb709a90d7e8a5507438bd7189f81f0d6030323a94f2826dddd6cc68a"} Apr 22 17:34:07.544207 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.544188 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" event={"ID":"4369eacb-a4c5-43c1-8201-a941859e9c99","Type":"ContainerStarted","Data":"7478a10a700224404fe8b84a23591783a0e1d7048fde94c75a642481f39a55b5"} Apr 22 17:34:07.545113 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.545092 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w7fhk" 
event={"ID":"c0bd56c9-8f3b-4147-87d8-99194e94569f","Type":"ContainerStarted","Data":"eadf09b4988810083a039f688165835c92e16f7ddc9d6b721980667b380b779d"} Apr 22 17:34:07.547398 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.547379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-85kw2" event={"ID":"f47ae572-69e3-4737-9211-cfdba8868f24","Type":"ContainerStarted","Data":"8c9d839f737e5a03517e4a040a7c539020f110dfe3e855a85bc13623792dd108"} Apr 22 17:34:07.549189 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.549170 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal" event={"ID":"dd997a444939df859d631949c5820f34","Type":"ContainerStarted","Data":"2e57091a5e519f9fd480ff9389acfd17b0dda51d08ff185b724076b267d0363b"} Apr 22 17:34:07.550229 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.550209 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jckwf" event={"ID":"edd47fac-d678-4368-a928-a3ac85b7a40a","Type":"ContainerStarted","Data":"5b6ca8eb72f4c0ab32498e1c739922f2a3818f8c316f24b5e338105c347edf5b"} Apr 22 17:34:07.551739 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:07.551716 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c6mcn" event={"ID":"413cd8ee-53a3-4bf3-9251-11b54262bb83","Type":"ContainerStarted","Data":"78cce64b58e0f6b312e40f1fdca922dd431fced3d4b73073d0bf670e2b478b8f"} Apr 22 17:34:08.033984 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.033673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:08.035156 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.035130 2568 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:08.035270 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.035221 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:10.035200566 +0000 UTC m=+6.016846496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:08.156617 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.156393 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:08.236498 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.235846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:08.236498 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.236037 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:08.236498 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.236062 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:08.236498 
ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.236075 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5m6v9 for pod openshift-network-diagnostics/network-check-target-6wq9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:08.236498 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.236138 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9 podName:bca926f3-b1c0-45e1-9eb2-e0d0fc51b178 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:10.236116671 +0000 UTC m=+6.217762621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m6v9" (UniqueName: "kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9") pod "network-check-target-6wq9d" (UID: "bca926f3-b1c0-45e1-9eb2-e0d0fc51b178") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:08.540538 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.540462 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:08.540982 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.540580 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:08.541111 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.541014 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:08.541159 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:08.541113 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:08.579576 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.579537 2568 generic.go:358] "Generic (PLEG): container finished" podID="ab7e5d80042d81dd59b2f6770610fc29" containerID="2d0ab2a84d52fcb8191076a3d3722a970158c3a190f2fc9ba5e0cdfc631f964e" exitCode=0 Apr 22 17:34:08.579756 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.579649 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal" event={"ID":"ab7e5d80042d81dd59b2f6770610fc29","Type":"ContainerDied","Data":"2d0ab2a84d52fcb8191076a3d3722a970158c3a190f2fc9ba5e0cdfc631f964e"} Apr 22 17:34:08.594288 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:08.593378 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-169.ec2.internal" podStartSLOduration=3.593362842 podStartE2EDuration="3.593362842s" podCreationTimestamp="2026-04-22 17:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:07.562861885 +0000 UTC m=+3.544507835" 
watchObservedRunningTime="2026-04-22 17:34:08.593362842 +0000 UTC m=+4.575008791" Apr 22 17:34:09.589683 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:09.589644 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal" event={"ID":"ab7e5d80042d81dd59b2f6770610fc29","Type":"ContainerStarted","Data":"ea26b89151f6a5b940fce19ee354da7dc0ad7ee809eca578f38ec978dbe8bacd"} Apr 22 17:34:10.052761 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:10.052721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:10.053118 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.052869 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:10.053118 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.052951 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:14.05291532 +0000 UTC m=+10.034561250 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:10.255200 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:10.255156 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:10.255377 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.255325 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:10.255377 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.255349 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:10.255377 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.255371 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5m6v9 for pod openshift-network-diagnostics/network-check-target-6wq9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:10.255542 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.255444 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9 podName:bca926f3-b1c0-45e1-9eb2-e0d0fc51b178 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:14.255414812 +0000 UTC m=+10.237060755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m6v9" (UniqueName: "kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9") pod "network-check-target-6wq9d" (UID: "bca926f3-b1c0-45e1-9eb2-e0d0fc51b178") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:10.528831 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:10.528748 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:10.529021 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.528886 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:10.529021 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:10.528990 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:10.529151 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:10.529091 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:12.170319 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.170260 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-169.ec2.internal" podStartSLOduration=7.170239199 podStartE2EDuration="7.170239199s" podCreationTimestamp="2026-04-22 17:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:09.602747031 +0000 UTC m=+5.584392981" watchObservedRunningTime="2026-04-22 17:34:12.170239199 +0000 UTC m=+8.151885148" Apr 22 17:34:12.172972 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.172919 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zrp9s"] Apr 22 17:34:12.179805 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.179783 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.179921 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.179868 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:12.275862 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.275822 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.276063 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.275924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-dbus\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.276063 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.275994 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-kubelet-config\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.376682 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.376631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-dbus\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.376861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.376712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-kubelet-config\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.376861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.376756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.377017 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.376864 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-dbus\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.377017 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.376905 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:12.377017 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.376956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-kubelet-config\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.377017 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.377004 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret podName:b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:12.876984135 +0000 UTC m=+8.858630069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret") pod "global-pull-secret-syncer-zrp9s" (UID: "b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:12.530702 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.528134 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:12.530702 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.528242 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:12.530702 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.528498 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:12.530702 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.528556 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:12.881096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:12.881006 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:12.881243 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.881140 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:12.881243 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:12.881200 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret podName:b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:13.881181994 +0000 UTC m=+9.862827921 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret") pod "global-pull-secret-syncer-zrp9s" (UID: "b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:13.889963 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:13.889913 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:13.890432 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:13.890085 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:13.890432 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:13.890149 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret podName:b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:15.890130764 +0000 UTC m=+11.871776696 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret") pod "global-pull-secret-syncer-zrp9s" (UID: "b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:14.091980 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:14.091921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:14.092168 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.092065 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:14.092168 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.092144 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:22.092123085 +0000 UTC m=+18.073769020 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:14.294158 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:14.294074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:14.294326 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.294271 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:14.294326 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.294294 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:14.294326 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.294307 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5m6v9 for pod openshift-network-diagnostics/network-check-target-6wq9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:14.294485 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.294361 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9 podName:bca926f3-b1c0-45e1-9eb2-e0d0fc51b178 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:22.294343235 +0000 UTC m=+18.275989165 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m6v9" (UniqueName: "kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9") pod "network-check-target-6wq9d" (UID: "bca926f3-b1c0-45e1-9eb2-e0d0fc51b178") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:14.530837 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:14.530799 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:14.531038 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:14.530925 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:14.531038 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.530933 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:14.531038 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:14.530959 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:14.531215 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.531035 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:14.531215 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:14.531133 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:15.906698 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:15.906662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:15.907198 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:15.906803 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:15.907198 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:15.906881 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret podName:b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:19.906861684 +0000 UTC m=+15.888507631 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret") pod "global-pull-secret-syncer-zrp9s" (UID: "b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:16.527760 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:16.527730 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:16.527930 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:16.527730 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:16.527930 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:16.527873 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:16.528057 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:16.527962 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:16.528057 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:16.527734 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:16.528057 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:16.528026 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:18.528318 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:18.528288 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:18.528740 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:18.528287 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:18.528740 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:18.528424 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:18.528740 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:18.528489 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:18.528740 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:18.528292 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:18.528740 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:18.528587 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:19.935903 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:19.935873 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:19.936315 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:19.936024 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:19.936315 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:19.936090 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret podName:b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:27.936071021 +0000 UTC m=+23.917716952 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret") pod "global-pull-secret-syncer-zrp9s" (UID: "b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:20.528032 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:20.528000 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:20.528032 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:20.528040 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:20.528280 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:20.528000 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:20.528280 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:20.528155 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:20.528280 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:20.528192 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:20.528280 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:20.528265 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:22.151757 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:22.151711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:22.152187 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.151872 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:22.152187 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.151950 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:38.1519231 +0000 UTC m=+34.133569027 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:22.353215 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:22.353183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:22.353389 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.353369 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:22.353445 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.353394 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:22.353445 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.353405 2568 projected.go:194] Error preparing data for projected volume kube-api-access-5m6v9 for pod openshift-network-diagnostics/network-check-target-6wq9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:22.353539 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.353457 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9 podName:bca926f3-b1c0-45e1-9eb2-e0d0fc51b178 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:38.353440745 +0000 UTC m=+34.335086671 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m6v9" (UniqueName: "kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9") pod "network-check-target-6wq9d" (UID: "bca926f3-b1c0-45e1-9eb2-e0d0fc51b178") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:22.528168 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:22.528097 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:22.528168 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:22.528118 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:22.528352 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:22.528097 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:22.528352 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.528191 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:22.528352 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.528278 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:22.528352 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:22.528342 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:24.546774 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.546236 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:24.546774 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:24.546505 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:24.546774 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.546618 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:24.546774 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:24.546732 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:24.547319 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.546791 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:24.547319 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:24.546898 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:24.618714 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.618532 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hspnj" event={"ID":"69032494-29d8-4c24-b5c0-06e8e6bb9787","Type":"ContainerStarted","Data":"53c4d16e20b3e8179bd15e2bb99e6e3085205d3ea65e9f37d9bb16afac0c9c9d"} Apr 22 17:34:24.621139 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.620853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-85kw2" event={"ID":"f47ae572-69e3-4737-9211-cfdba8868f24","Type":"ContainerStarted","Data":"94989becbee85ee420a404a1687ccfff6817538bbda55ce5900917a9cb762a56"} Apr 22 17:34:24.638295 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.638255 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hspnj" podStartSLOduration=3.597232263 podStartE2EDuration="20.638238908s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.309331461 +0000 UTC m=+3.290977388" lastFinishedPulling="2026-04-22 17:34:24.350338101 +0000 UTC m=+20.331984033" 
observedRunningTime="2026-04-22 17:34:24.638045358 +0000 UTC m=+20.619691308" watchObservedRunningTime="2026-04-22 17:34:24.638238908 +0000 UTC m=+20.619884858" Apr 22 17:34:24.653443 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.653405 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-85kw2" podStartSLOduration=3.944144996 podStartE2EDuration="20.653391962s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.282829786 +0000 UTC m=+3.264475713" lastFinishedPulling="2026-04-22 17:34:23.992076738 +0000 UTC m=+19.973722679" observedRunningTime="2026-04-22 17:34:24.653020086 +0000 UTC m=+20.634666035" watchObservedRunningTime="2026-04-22 17:34:24.653391962 +0000 UTC m=+20.635037916" Apr 22 17:34:24.658142 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.657553 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:24.659123 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:24.659107 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:25.624402 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624243 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624646 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ffbd3d2-a676-442d-8a27-08dac0cc37fe" containerID="2607fb981c03c6103ab45b7457ca6bf2847a18315823f1d9afec504b391fa500" exitCode=1 Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624712 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" 
event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"d3773bf1274c30def51b1d9de26c0cb29c6d047e8b7bc210dbcbbca1b6c79456"} Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"3a24b87132eba2cc4357357ada6141c67834ae7d054902184c1668e15d24fec0"} Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"94480525f42423561020dfd6b524f9d38f9e6eea876c17db2ab6458bf1530d7c"} Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624759 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"a2663e68d21a441cd7a0796d084089b29c92515a18ce6a05b92e86e27947ddd3"} Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerDied","Data":"2607fb981c03c6103ab45b7457ca6bf2847a18315823f1d9afec504b391fa500"} Apr 22 17:34:25.625058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.624777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"a4b08927aa5004dcd9b2d1202d0ed822f39348b1de19027afdf3bfa4c2f74ecf"} Apr 22 17:34:25.625792 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.625773 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" event={"ID":"4369eacb-a4c5-43c1-8201-a941859e9c99","Type":"ContainerStarted","Data":"da5bc376c7e33593f2ea028fb33d960cab126c5941d71b6fe484c5266d948586"} Apr 22 17:34:25.626838 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.626818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jckwf" event={"ID":"edd47fac-d678-4368-a928-a3ac85b7a40a","Type":"ContainerStarted","Data":"5e3cd9d3a968739468b2d94150c9a208eb06803278b23c2f4099307bfb99da84"} Apr 22 17:34:25.628175 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.628156 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c6mcn" event={"ID":"413cd8ee-53a3-4bf3-9251-11b54262bb83","Type":"ContainerStarted","Data":"b437096d241a9fe70cc8366e103c15bdec156b34bfd554d84f90cf599f73ad33"} Apr 22 17:34:25.629396 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.629375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-snnsr" event={"ID":"4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc","Type":"ContainerStarted","Data":"024b7bec13ad8d912034271123dcbfbec114ccb9bcfbbc1ded550e227b1d8aa0"} Apr 22 17:34:25.630669 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.630637 2568 generic.go:358] "Generic (PLEG): container finished" podID="51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81" containerID="8a9836a3c345706ca4dcbcaf7ae4c5a7c008cb2bc061d22101ec6d598ddc7eb1" exitCode=0 Apr 22 17:34:25.630755 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.630719 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerDied","Data":"8a9836a3c345706ca4dcbcaf7ae4c5a7c008cb2bc061d22101ec6d598ddc7eb1"} Apr 22 17:34:25.631182 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.631136 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:25.631807 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.631792 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-85kw2" Apr 22 17:34:25.644018 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.643985 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jckwf" podStartSLOduration=4.592680953 podStartE2EDuration="21.643975724s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.314580107 +0000 UTC m=+3.296226048" lastFinishedPulling="2026-04-22 17:34:24.36587488 +0000 UTC m=+20.347520819" observedRunningTime="2026-04-22 17:34:25.643800402 +0000 UTC m=+21.625446372" watchObservedRunningTime="2026-04-22 17:34:25.643975724 +0000 UTC m=+21.625621673" Apr 22 17:34:25.683913 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.683844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-c6mcn" podStartSLOduration=12.652725389 podStartE2EDuration="21.683834424s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.314887919 +0000 UTC m=+3.296533862" lastFinishedPulling="2026-04-22 17:34:16.345996966 +0000 UTC m=+12.327642897" observedRunningTime="2026-04-22 17:34:25.683643456 +0000 UTC m=+21.665289406" watchObservedRunningTime="2026-04-22 17:34:25.683834424 +0000 UTC m=+21.665480374" Apr 22 17:34:25.696371 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:25.696329 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-snnsr" podStartSLOduration=4.663288011 podStartE2EDuration="21.696320078s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.314621076 +0000 UTC m=+3.296267009" lastFinishedPulling="2026-04-22 17:34:24.347653128 +0000 UTC m=+20.329299076" 
observedRunningTime="2026-04-22 17:34:25.696082851 +0000 UTC m=+21.677728808" watchObservedRunningTime="2026-04-22 17:34:25.696320078 +0000 UTC m=+21.677966026" Apr 22 17:34:26.113134 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.113108 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:34:26.471836 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.471736 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:34:26.113127725Z","UUID":"88a5b8f1-4992-4c32-988d-9239d76b86d7","Handler":null,"Name":"","Endpoint":""} Apr 22 17:34:26.473395 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.473376 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:34:26.473520 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.473404 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:34:26.528750 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.528722 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:26.528894 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.528728 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:26.528894 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:26.528860 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:26.529044 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:26.528955 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:26.529044 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.528730 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:26.529141 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:26.529053 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:26.633739 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.633705 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" event={"ID":"4369eacb-a4c5-43c1-8201-a941859e9c99","Type":"ContainerStarted","Data":"9d8e784b2d30ea94363f5a2afe8426dd9a8f855b43c19e9396e406d31c3a9e66"} Apr 22 17:34:26.635061 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.635033 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w7fhk" event={"ID":"c0bd56c9-8f3b-4147-87d8-99194e94569f","Type":"ContainerStarted","Data":"8a224284f6db6102992e31fa27b9601309e9c2b3499672fbb88e17fd6ffaafb2"} Apr 22 17:34:26.659151 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:26.659109 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-w7fhk" podStartSLOduration=5.638324646 podStartE2EDuration="22.659097121s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.28460694 +0000 UTC m=+3.266252876" lastFinishedPulling="2026-04-22 17:34:24.305379411 +0000 UTC m=+20.287025351" observedRunningTime="2026-04-22 17:34:26.658643271 +0000 UTC m=+22.640289219" watchObservedRunningTime="2026-04-22 17:34:26.659097121 +0000 UTC m=+22.640743070" Apr 22 17:34:27.641392 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:27.641172 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:34:27.641822 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:27.641727 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" 
event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"6f01772901e1179d0981f43a170c31bee178a6cc3029dbc2a7fd1290c54b04fd"} Apr 22 17:34:27.992046 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:27.991967 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:27.992200 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:27.992105 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:27.992200 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:27.992185 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret podName:b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:43.992164246 +0000 UTC m=+39.973810177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret") pod "global-pull-secret-syncer-zrp9s" (UID: "b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:28.528644 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:28.528611 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:28.528644 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:28.528635 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:28.528872 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:28.528610 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:28.528872 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:28.528755 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:28.528872 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:28.528818 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:28.529013 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:28.528895 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:28.645247 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:28.645209 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" event={"ID":"4369eacb-a4c5-43c1-8201-a941859e9c99","Type":"ContainerStarted","Data":"9142aca3a36e91d8d5bb03f0509452ce4d55f648edc129e139eda17ae571db3d"} Apr 22 17:34:28.664898 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:28.664859 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lhsn2" podStartSLOduration=4.444836959 podStartE2EDuration="24.664845806s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.305662202 +0000 UTC m=+3.287308150" lastFinishedPulling="2026-04-22 17:34:27.525671067 +0000 UTC m=+23.507316997" observedRunningTime="2026-04-22 17:34:28.663823269 +0000 UTC m=+24.645469221" watchObservedRunningTime="2026-04-22 17:34:28.664845806 +0000 UTC m=+24.646491755" Apr 22 17:34:30.528055 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.527879 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:30.528670 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.527894 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:30.528670 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:30.528118 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:30.528670 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.527894 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:30.528670 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:30.528194 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:30.528670 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:30.528277 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:30.650132 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.650095 2568 generic.go:358] "Generic (PLEG): container finished" podID="51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81" containerID="7da0c8b65e54bad9d2ef50701a3c8b550426006807a20c58d37d29df16af8fcf" exitCode=0 Apr 22 17:34:30.650276 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.650175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerDied","Data":"7da0c8b65e54bad9d2ef50701a3c8b550426006807a20c58d37d29df16af8fcf"} Apr 22 17:34:30.653223 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.653198 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:34:30.653536 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.653517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"abaa06fe5552a5bb4da8eba9aadf565e4b0e579d6143e23e8d87e45727b854bf"} Apr 22 17:34:30.653744 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.653727 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:30.653816 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.653755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:30.653931 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:30.653913 2568 scope.go:117] "RemoveContainer" containerID="2607fb981c03c6103ab45b7457ca6bf2847a18315823f1d9afec504b391fa500" Apr 22 17:34:30.668210 ip-10-0-133-169 kubenswrapper[2568]: 
I0422 17:34:30.668193 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:31.656734 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.656669 2568 generic.go:358] "Generic (PLEG): container finished" podID="51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81" containerID="9c2eacccc16c97e9986fb08bbeeaa57f2d4f803e82039df92be6c741feffa477" exitCode=0 Apr 22 17:34:31.657256 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.656745 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerDied","Data":"9c2eacccc16c97e9986fb08bbeeaa57f2d4f803e82039df92be6c741feffa477"} Apr 22 17:34:31.660103 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.660076 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:34:31.660446 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.660422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" event={"ID":"1ffbd3d2-a676-442d-8a27-08dac0cc37fe","Type":"ContainerStarted","Data":"384809bcc064251bb86a34985ed695a01e02dbd10e8175e629409ca6e1b3c6f4"} Apr 22 17:34:31.660704 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.660686 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:31.674145 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.674124 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" Apr 22 17:34:31.703374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.703347 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6wq9d"] Apr 22 17:34:31.703471 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.703447 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:31.703537 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:31.703520 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:31.706754 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.706734 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cb4rf"] Apr 22 17:34:31.706839 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.706819 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:31.706901 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:31.706887 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:31.712606 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.712566 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs" podStartSLOduration=10.627431397 podStartE2EDuration="27.712555776s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.308796384 +0000 UTC m=+3.290442325" lastFinishedPulling="2026-04-22 17:34:24.393920773 +0000 UTC m=+20.375566704" observedRunningTime="2026-04-22 17:34:31.712417856 +0000 UTC m=+27.694063805" watchObservedRunningTime="2026-04-22 17:34:31.712555776 +0000 UTC m=+27.694201718" Apr 22 17:34:31.714847 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.714827 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zrp9s"] Apr 22 17:34:31.714952 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:31.714901 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:31.715024 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:31.714975 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:32.663586 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:32.663506 2568 generic.go:358] "Generic (PLEG): container finished" podID="51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81" containerID="107c1d46fec35b00000d504ba77efdd6267d90bafdceda0c6de6b2a521a7e997" exitCode=0 Apr 22 17:34:32.663952 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:32.663593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerDied","Data":"107c1d46fec35b00000d504ba77efdd6267d90bafdceda0c6de6b2a521a7e997"} Apr 22 17:34:33.528321 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:33.528216 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:33.528321 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:33.528227 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:33.528575 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:33.528346 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:33.528575 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:33.528439 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:33.528575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:33.528462 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:33.528820 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:33.528569 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:35.528624 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:35.528585 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:35.529215 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:35.528585 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:35.529215 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:35.528728 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:34:35.529215 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:35.528585 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:35.529215 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:35.528788 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-zrp9s" podUID="b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc" Apr 22 17:34:35.529215 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:35.528855 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6wq9d" podUID="bca926f3-b1c0-45e1-9eb2-e0d0fc51b178" Apr 22 17:34:37.394819 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.394778 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-169.ec2.internal" event="NodeReady" Apr 22 17:34:37.395347 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.394964 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:34:37.449853 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.449761 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85644b5694-r229r"] Apr 22 17:34:37.454295 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.454266 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"] Apr 22 17:34:37.454465 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.454443 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.457084 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.457041 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp"] Apr 22 17:34:37.457508 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.457395 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:37.459648 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.459630 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx"] Apr 22 17:34:37.459793 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.459774 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.462575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462244 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:34:37.462575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462273 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dsn92\"" Apr 22 17:34:37.462575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462249 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:34:37.462575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462378 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:34:37.462575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462412 2568 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 17:34:37.462875 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462592 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n"] Apr 22 17:34:37.462875 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462844 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 17:34:37.463028 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.462876 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.463806 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.463747 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 17:34:37.463806 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.463795 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 17:34:37.463972 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.463856 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 17:34:37.464198 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.464177 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 17:34:37.465917 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.465874 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.467058 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.467040 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rmqdq\"" Apr 22 17:34:37.467636 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.467618 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 17:34:37.467983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.467962 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-cbhgc\"" Apr 22 17:34:37.469206 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.469188 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:34:37.470072 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.470051 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 17:34:37.470179 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.470092 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 17:34:37.470179 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.470123 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 17:34:37.470393 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.470336 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 17:34:37.481438 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.481416 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx"] Apr 22 17:34:37.491597 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.491301 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85644b5694-r229r"] Apr 22 17:34:37.492398 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.492373 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp"] Apr 22 17:34:37.493080 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.493061 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"] Apr 22 17:34:37.498665 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.498642 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5n8fn"] Apr 22 17:34:37.504367 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.504344 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.504476 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.504381 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dp46v"] Apr 22 17:34:37.507156 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.507137 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:37.509467 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.509449 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:34:37.509781 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.509761 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:34:37.509877 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.509860 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7g4b5\"" Apr 22 17:34:37.509967 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.509880 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dsn9k\"" Apr 22 17:34:37.510171 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.510149 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:34:37.510259 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.510241 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:34:37.510317 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.510260 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:34:37.517234 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.517208 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5n8fn"] Apr 22 17:34:37.525361 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.525320 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n"] Apr 22 17:34:37.528009 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:34:37.527926 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:37.528128 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.528074 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:37.528796 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.528367 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:37.531287 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.531267 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:34:37.532672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.532323 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:34:37.532672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.532387 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kkblr\"" Apr 22 17:34:37.532672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.532390 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:34:37.532672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.532491 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lgj26\"" Apr 22 17:34:37.532672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.532592 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:34:37.534086 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.534062 2568 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp46v"] Apr 22 17:34:37.567116 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567071 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac05f344-d6fd-4e33-aefe-4328398b5faf-ca-trust-extracted\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567298 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-bound-sa-token\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567298 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567163 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z96pm\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-kube-api-access-z96pm\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567298 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567194 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e2aa7890-fb2e-4dbf-939b-479f29da74e1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6769747df7-s6hdx\" (UID: \"e2aa7890-fb2e-4dbf-939b-479f29da74e1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.567298 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-ca\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.567534 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/dd6771b1-2f0a-4818-8710-c7d543723c88-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.567534 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-klusterlet-config\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.567534 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567421 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxxx\" (UniqueName: \"kubernetes.io/projected/e2aa7890-fb2e-4dbf-939b-479f29da74e1-kube-api-access-kcxxx\") pod \"managed-serviceaccount-addon-agent-6769747df7-s6hdx\" (UID: \"e2aa7890-fb2e-4dbf-939b-479f29da74e1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.567534 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567447 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-hub\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.567683 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-installation-pull-secrets\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567683 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-image-registry-private-configuration\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567683 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.567683 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567644 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/47363b61-021b-4c33-bee9-3f7fb1dc9969-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:37.567813 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:37.567813 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.567813 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567799 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567963 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-certificates\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.567963 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567852 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-tmp\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.567963 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqgr8\" (UniqueName: \"kubernetes.io/projected/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-kube-api-access-pqgr8\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.567963 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntwk\" (UniqueName: \"kubernetes.io/projected/dd6771b1-2f0a-4818-8710-c7d543723c88-kube-api-access-xntwk\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.567963 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.567910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-trusted-ca\") pod 
\"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.669197 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqgr8\" (UniqueName: \"kubernetes.io/projected/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-kube-api-access-pqgr8\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.669386 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669209 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-tmp\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.669386 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-trusted-ca\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.669386 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e2aa7890-fb2e-4dbf-939b-479f29da74e1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6769747df7-s6hdx\" (UID: \"e2aa7890-fb2e-4dbf-939b-479f29da74e1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.669386 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:34:37.669305 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z96pm\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-kube-api-access-z96pm\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.669594 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669530 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/dd6771b1-2f0a-4818-8710-c7d543723c88-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.669594 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669568 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-ca\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.669688 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-hub\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.669688 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.669688 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrw8\" (UniqueName: \"kubernetes.io/projected/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-kube-api-access-vkrw8\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:37.669836 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/47363b61-021b-4c33-bee9-3f7fb1dc9969-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:37.669836 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:37.669836 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: 
\"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.669836 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-config-volume\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.670049 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.669877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-tmp\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-trusted-ca\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd6771b1-2f0a-4818-8710-c7d543723c88-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670556 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-certificates\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670628 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xntwk\" (UniqueName: \"kubernetes.io/projected/dd6771b1-2f0a-4818-8710-c7d543723c88-kube-api-access-xntwk\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.670673 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670658 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " 
pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:37.671049 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.670919 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/47363b61-021b-4c33-bee9-3f7fb1dc9969-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:37.671104 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.671070 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:37.671104 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.671087 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found Apr 22 17:34:37.671214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671143 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-certificates\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.671214 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.671150 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:34:38.171132488 +0000 UTC m=+34.152778417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found Apr 22 17:34:37.671214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671199 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49p68\" (UniqueName: \"kubernetes.io/projected/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-kube-api-access-49p68\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.671366 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671250 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac05f344-d6fd-4e33-aefe-4328398b5faf-ca-trust-extracted\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.671366 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.671308 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:37.671366 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-bound-sa-token\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.671520 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.671414 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:38.171398577 +0000 UTC m=+34.153044504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found Apr 22 17:34:37.671575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-klusterlet-config\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.671575 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671540 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-tmp-dir\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.671677 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxxx\" (UniqueName: \"kubernetes.io/projected/e2aa7890-fb2e-4dbf-939b-479f29da74e1-kube-api-access-kcxxx\") pod \"managed-serviceaccount-addon-agent-6769747df7-s6hdx\" (UID: \"e2aa7890-fb2e-4dbf-939b-479f29da74e1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.671677 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:34:37.671604 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac05f344-d6fd-4e33-aefe-4328398b5faf-ca-trust-extracted\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.671677 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-installation-pull-secrets\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.671826 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.671709 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-image-registry-private-configuration\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.674535 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.674507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-hub\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.674659 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.674507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-hub-kubeconfig\") 
pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.674659 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.674606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e2aa7890-fb2e-4dbf-939b-479f29da74e1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6769747df7-s6hdx\" (UID: \"e2aa7890-fb2e-4dbf-939b-479f29da74e1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.675151 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.675127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-image-registry-private-configuration\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.675299 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.675172 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.675401 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.675243 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/dd6771b1-2f0a-4818-8710-c7d543723c88-ca\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.675658 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.675617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-klusterlet-config\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.676477 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.676454 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-installation-pull-secrets\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.680750 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.680725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqgr8\" (UniqueName: \"kubernetes.io/projected/4b0b3cbd-6bf3-41d3-a5a0-f236662c2033-kube-api-access-pqgr8\") pod \"klusterlet-addon-workmgr-8487cc97d4-fq6gp\" (UID: \"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.681203 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.681179 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxxx\" (UniqueName: \"kubernetes.io/projected/e2aa7890-fb2e-4dbf-939b-479f29da74e1-kube-api-access-kcxxx\") pod \"managed-serviceaccount-addon-agent-6769747df7-s6hdx\" (UID: \"e2aa7890-fb2e-4dbf-939b-479f29da74e1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.684614 ip-10-0-133-169 kubenswrapper[2568]: I0422 
17:34:37.684583 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z96pm\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-kube-api-access-z96pm\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.684726 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.684623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-bound-sa-token\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:37.686446 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.686425 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntwk\" (UniqueName: \"kubernetes.io/projected/dd6771b1-2f0a-4818-8710-c7d543723c88-kube-api-access-xntwk\") pod \"cluster-proxy-proxy-agent-fd854664f-fn89n\" (UID: \"dd6771b1-2f0a-4818-8710-c7d543723c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:37.773147 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrw8\" (UniqueName: \"kubernetes.io/projected/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-kube-api-access-vkrw8\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:37.773147 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-config-volume\") pod \"dns-default-5n8fn\" (UID: 
\"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.773425 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773151 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.773425 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:37.773425 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49p68\" (UniqueName: \"kubernetes.io/projected/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-kube-api-access-49p68\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.773425 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-tmp-dir\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.773425 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.773422 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:37.773713 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.773494 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:38.273475283 +0000 UTC m=+34.255121218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found Apr 22 17:34:37.773713 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.773503 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:37.773713 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:37.773556 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:38.273543752 +0000 UTC m=+34.255189679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found Apr 22 17:34:37.773713 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-tmp-dir\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.773925 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.773798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-config-volume\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.781403 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.781365 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:34:37.786799 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.786769 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrw8\" (UniqueName: \"kubernetes.io/projected/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-kube-api-access-vkrw8\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:37.787488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.787464 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49p68\" (UniqueName: \"kubernetes.io/projected/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-kube-api-access-49p68\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:37.794476 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.794429 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:34:37.804321 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:37.804286 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" Apr 22 17:34:38.177041 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.177000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:38.177227 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.177052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:34:38.177227 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.177092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:38.177227 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177182 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:38.177227 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177224 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:38.177422 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177239 2568 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found Apr 22 17:34:38.177422 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177259 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.177238935 +0000 UTC m=+35.158884880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found Apr 22 17:34:38.177422 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177288 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.177271309 +0000 UTC m=+35.158917245 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found Apr 22 17:34:38.177422 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177339 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:34:38.177422 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.177390 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:10.177375273 +0000 UTC m=+66.159021205 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : secret "metrics-daemon-secret" not found Apr 22 17:34:38.277881 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.277844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:38.278043 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.277893 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:38.278043 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.278000 2568 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:38.278043 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.278003 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:38.278142 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.278054 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.278039041 +0000 UTC m=+35.259684968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found Apr 22 17:34:38.278142 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:38.278069 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:39.278062131 +0000 UTC m=+35.259708058 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found Apr 22 17:34:38.380325 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.379716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:38.383228 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.383182 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6v9\" (UniqueName: \"kubernetes.io/projected/bca926f3-b1c0-45e1-9eb2-e0d0fc51b178-kube-api-access-5m6v9\") pod \"network-check-target-6wq9d\" (UID: \"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178\") " pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:38.451406 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.451366 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6wq9d" Apr 22 17:34:38.583819 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.583735 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx"] Apr 22 17:34:38.586356 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.586323 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp"] Apr 22 17:34:38.597583 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.597559 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n"] Apr 22 17:34:38.619366 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.619340 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6wq9d"] Apr 22 17:34:38.640328 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:38.640241 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2aa7890_fb2e_4dbf_939b_479f29da74e1.slice/crio-1a412f923a024e3dfec83e69c5e8fb511f119eb64625241b4864e7bf57913f14 WatchSource:0}: Error finding container 1a412f923a024e3dfec83e69c5e8fb511f119eb64625241b4864e7bf57913f14: Status 404 returned error can't find the container with id 1a412f923a024e3dfec83e69c5e8fb511f119eb64625241b4864e7bf57913f14 Apr 22 17:34:38.640653 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:38.640633 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0b3cbd_6bf3_41d3_a5a0_f236662c2033.slice/crio-889295e3ece5e0a29171139efbcbf671423687ef3c3eaf1cffd836a18a304047 WatchSource:0}: Error finding container 889295e3ece5e0a29171139efbcbf671423687ef3c3eaf1cffd836a18a304047: Status 404 returned error can't find the 
container with id 889295e3ece5e0a29171139efbcbf671423687ef3c3eaf1cffd836a18a304047 Apr 22 17:34:38.641612 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:38.641579 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd6771b1_2f0a_4818_8710_c7d543723c88.slice/crio-c7c92fa69c5a6e40e17fa21a610a8bb880d50ed7ebfb305ecabd243c76ff2671 WatchSource:0}: Error finding container c7c92fa69c5a6e40e17fa21a610a8bb880d50ed7ebfb305ecabd243c76ff2671: Status 404 returned error can't find the container with id c7c92fa69c5a6e40e17fa21a610a8bb880d50ed7ebfb305ecabd243c76ff2671 Apr 22 17:34:38.642556 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:38.642535 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbca926f3_b1c0_45e1_9eb2_e0d0fc51b178.slice/crio-5015e00addea3bef9e2c9f3e933c08f52f13420321d9187293feddea03e0c8c5 WatchSource:0}: Error finding container 5015e00addea3bef9e2c9f3e933c08f52f13420321d9187293feddea03e0c8c5: Status 404 returned error can't find the container with id 5015e00addea3bef9e2c9f3e933c08f52f13420321d9187293feddea03e0c8c5 Apr 22 17:34:38.674508 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.674475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" event={"ID":"e2aa7890-fb2e-4dbf-939b-479f29da74e1","Type":"ContainerStarted","Data":"1a412f923a024e3dfec83e69c5e8fb511f119eb64625241b4864e7bf57913f14"} Apr 22 17:34:38.675496 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.675468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" event={"ID":"dd6771b1-2f0a-4818-8710-c7d543723c88","Type":"ContainerStarted","Data":"c7c92fa69c5a6e40e17fa21a610a8bb880d50ed7ebfb305ecabd243c76ff2671"} Apr 22 17:34:38.676412 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:34:38.676384 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" event={"ID":"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033","Type":"ContainerStarted","Data":"889295e3ece5e0a29171139efbcbf671423687ef3c3eaf1cffd836a18a304047"} Apr 22 17:34:38.677340 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:38.677310 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6wq9d" event={"ID":"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178","Type":"ContainerStarted","Data":"5015e00addea3bef9e2c9f3e933c08f52f13420321d9187293feddea03e0c8c5"} Apr 22 17:34:39.188300 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:39.188203 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:39.188300 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:39.188286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:39.188529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.188380 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:39.188529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.188435 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 22 17:34:39.188529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.188456 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found Apr 22 17:34:39.188529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.188463 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:41.188442707 +0000 UTC m=+37.170088634 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found Apr 22 17:34:39.188529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.188506 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:34:41.188492458 +0000 UTC m=+37.170138385 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found Apr 22 17:34:39.289262 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:39.289227 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:39.289422 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:39.289276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:39.289422 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.289395 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:39.289523 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.289421 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:39.289523 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.289475 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:41.289454354 +0000 UTC m=+37.271100288 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found Apr 22 17:34:39.289523 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:39.289497 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:41.289487103 +0000 UTC m=+37.271133035 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found Apr 22 17:34:39.686966 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:39.686383 2568 generic.go:358] "Generic (PLEG): container finished" podID="51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81" containerID="f5b09e2ceddf1195fbde413e3f018385c8cdcdbcbd0a74fff9a083b154b8401f" exitCode=0 Apr 22 17:34:39.686966 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:39.686462 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerDied","Data":"f5b09e2ceddf1195fbde413e3f018385c8cdcdbcbd0a74fff9a083b154b8401f"} Apr 22 17:34:40.711097 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:40.710165 2568 generic.go:358] "Generic (PLEG): container finished" podID="51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81" containerID="bfdbb2d7de1bce7765cbefd8ec4ad4678810b93ec39bddc1c5ddd364e6fa7524" exitCode=0 Apr 22 17:34:40.711097 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:40.710219 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" 
event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerDied","Data":"bfdbb2d7de1bce7765cbefd8ec4ad4678810b93ec39bddc1c5ddd364e6fa7524"} Apr 22 17:34:41.208542 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:41.208505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:41.208718 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:41.208646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:41.208782 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.208751 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:41.208782 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.208776 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found Apr 22 17:34:41.208886 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.208797 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:41.208886 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.208844 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls 
podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.208821787 +0000 UTC m=+41.190467718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found Apr 22 17:34:41.208886 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.208864 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.208853453 +0000 UTC m=+41.190499383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found Apr 22 17:34:41.309214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:41.309183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:41.309393 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:41.309235 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:41.309393 ip-10-0-133-169 
kubenswrapper[2568]: E0422 17:34:41.309346 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:41.309505 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.309407 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.309386578 +0000 UTC m=+41.291032507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found Apr 22 17:34:41.309708 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.309681 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:34:41.309825 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:41.309740 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.309724054 +0000 UTC m=+41.291369996 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found Apr 22 17:34:41.716323 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:41.716284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" event={"ID":"51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81","Type":"ContainerStarted","Data":"a5b497f954c221c1622ba811acd366adcb53f45125706d2d80039b0e0e8d6604"} Apr 22 17:34:41.753555 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:41.753471 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9nhgg" podStartSLOduration=6.393367525 podStartE2EDuration="37.753446164s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:07.314608225 +0000 UTC m=+3.296254166" lastFinishedPulling="2026-04-22 17:34:38.674686878 +0000 UTC m=+34.656332805" observedRunningTime="2026-04-22 17:34:41.75165701 +0000 UTC m=+37.733302960" watchObservedRunningTime="2026-04-22 17:34:41.753446164 +0000 UTC m=+37.735092115" Apr 22 17:34:44.028828 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:44.028787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" (UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:44.033166 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:44.033129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc-original-pull-secret\") pod \"global-pull-secret-syncer-zrp9s\" 
(UID: \"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc\") " pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:44.157894 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:44.157850 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zrp9s" Apr 22 17:34:45.238001 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:45.237952 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:34:45.238492 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.238115 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:34:45.238492 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:45.238189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:34:45.238492 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.238202 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.238180433 +0000 UTC m=+49.219826366 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found Apr 22 17:34:45.238492 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.238265 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:34:45.238492 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.238276 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found Apr 22 17:34:45.238492 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.238310 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.23829935 +0000 UTC m=+49.219945300 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found Apr 22 17:34:45.339709 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:45.339656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:34:45.339904 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:45.339724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:34:45.339904 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.339837 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:34:45.340050 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.339920 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.339895592 +0000 UTC m=+49.321541527 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found
Apr 22 17:34:45.340050 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.339838 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:45.340050 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:45.339974 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.339963793 +0000 UTC m=+49.321609725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:46.814550 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:46.814523 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zrp9s"]
Apr 22 17:34:46.817729 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:34:46.817686 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8054fdb_27d5_4a8f_9c9c_dd1cf525eedc.slice/crio-42afd496c57681efa673028063d880338ef51b74e5adffb88143da22c08687ae WatchSource:0}: Error finding container 42afd496c57681efa673028063d880338ef51b74e5adffb88143da22c08687ae: Status 404 returned error can't find the container with id 42afd496c57681efa673028063d880338ef51b74e5adffb88143da22c08687ae
Apr 22 17:34:47.730279 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.730241 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" event={"ID":"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033","Type":"ContainerStarted","Data":"2044a4101ce229e9a148a6039556abbd8529ed8c5955514298128f53f231807e"}
Apr 22 17:34:47.730614 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.730595 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp"
Apr 22 17:34:47.731859 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.731821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6wq9d" event={"ID":"bca926f3-b1c0-45e1-9eb2-e0d0fc51b178","Type":"ContainerStarted","Data":"14e8a940b09ff2f65bf0a522fc342de0ecce4251fb47bb98d4b1ed3c3d6a376e"}
Apr 22 17:34:47.731987 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.731929 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6wq9d"
Apr 22 17:34:47.732411 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.732392 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp"
Apr 22 17:34:47.733310 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.733290 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" event={"ID":"e2aa7890-fb2e-4dbf-939b-479f29da74e1","Type":"ContainerStarted","Data":"c6633d5ad6b4b445f96cc5d0c782196d81a992401a35eeb8f24476665a72eb03"}
Apr 22 17:34:47.735196 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.735176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" event={"ID":"dd6771b1-2f0a-4818-8710-c7d543723c88","Type":"ContainerStarted","Data":"3ef862ab2bd4f463331846a691a4a7d564f30b88f57b2732f7d206f982a77161"}
Apr 22 17:34:47.736262 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.736237 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zrp9s" event={"ID":"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc","Type":"ContainerStarted","Data":"42afd496c57681efa673028063d880338ef51b74e5adffb88143da22c08687ae"}
Apr 22 17:34:47.749793 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.749751 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" podStartSLOduration=11.707754244 podStartE2EDuration="19.749738633s" podCreationTimestamp="2026-04-22 17:34:28 +0000 UTC" firstStartedPulling="2026-04-22 17:34:38.651166553 +0000 UTC m=+34.632812494" lastFinishedPulling="2026-04-22 17:34:46.693150943 +0000 UTC m=+42.674796883" observedRunningTime="2026-04-22 17:34:47.749352835 +0000 UTC m=+43.730998786" watchObservedRunningTime="2026-04-22 17:34:47.749738633 +0000 UTC m=+43.731384630"
Apr 22 17:34:47.784665 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:47.784606 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6wq9d" podStartSLOduration=35.742730382 podStartE2EDuration="43.784588641s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:34:38.651309996 +0000 UTC m=+34.632955925" lastFinishedPulling="2026-04-22 17:34:46.693168242 +0000 UTC m=+42.674814184" observedRunningTime="2026-04-22 17:34:47.784255385 +0000 UTC m=+43.765901347" watchObservedRunningTime="2026-04-22 17:34:47.784588641 +0000 UTC m=+43.766234584"
Apr 22 17:34:51.748154 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:51.748125 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" event={"ID":"dd6771b1-2f0a-4818-8710-c7d543723c88","Type":"ContainerStarted","Data":"5a9e0a93e2489fd9dd3b78b23b507bc0b675a6878190780a60728879cb53f30e"}
Apr 22 17:34:51.748525 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:51.748166 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" event={"ID":"dd6771b1-2f0a-4818-8710-c7d543723c88","Type":"ContainerStarted","Data":"a4b152e51f46a14fc5aeb68360a31c40ffdd1f5dddb1daa816b4baedd86d2c1f"}
Apr 22 17:34:51.769865 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:51.769820 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" podStartSLOduration=15.728619151 podStartE2EDuration="23.769806092s" podCreationTimestamp="2026-04-22 17:34:28 +0000 UTC" firstStartedPulling="2026-04-22 17:34:38.65120918 +0000 UTC m=+34.632855113" lastFinishedPulling="2026-04-22 17:34:46.692396115 +0000 UTC m=+42.674042054" observedRunningTime="2026-04-22 17:34:47.810958853 +0000 UTC m=+43.792604802" watchObservedRunningTime="2026-04-22 17:34:51.769806092 +0000 UTC m=+47.751452040"
Apr 22 17:34:51.770430 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:51.770396 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" podStartSLOduration=10.983411637 podStartE2EDuration="23.770388439s" podCreationTimestamp="2026-04-22 17:34:28 +0000 UTC" firstStartedPulling="2026-04-22 17:34:38.651072108 +0000 UTC m=+34.632718039" lastFinishedPulling="2026-04-22 17:34:51.4380489 +0000 UTC m=+47.419694841" observedRunningTime="2026-04-22 17:34:51.769331901 +0000 UTC m=+47.750977851" watchObservedRunningTime="2026-04-22 17:34:51.770388439 +0000 UTC m=+47.752034387"
Apr 22 17:34:52.752018 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:52.751978 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zrp9s" event={"ID":"b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc","Type":"ContainerStarted","Data":"2cd4ea4b5faf4892148650bd573de29a5f07d3a8672ba166a2e4e83fcc7b4f04"}
Apr 22 17:34:52.770331 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:52.770287 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zrp9s" podStartSLOduration=35.861901099 podStartE2EDuration="40.770272946s" podCreationTimestamp="2026-04-22 17:34:12 +0000 UTC" firstStartedPulling="2026-04-22 17:34:46.819782521 +0000 UTC m=+42.801428448" lastFinishedPulling="2026-04-22 17:34:51.728154364 +0000 UTC m=+47.709800295" observedRunningTime="2026-04-22 17:34:52.769032374 +0000 UTC m=+48.750678336" watchObservedRunningTime="2026-04-22 17:34:52.770272946 +0000 UTC m=+48.751918894"
Apr 22 17:34:53.311917 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:53.311883 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:34:53.312107 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:53.311958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r"
Apr 22 17:34:53.312107 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.312031 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:34:53.312107 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.312055 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:34:53.312107 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.312072 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found
Apr 22 17:34:53.312107 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.312107 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:09.312090942 +0000 UTC m=+65.293736868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found
Apr 22 17:34:53.312273 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.312126 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:35:09.312114883 +0000 UTC m=+65.293760813 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found
Apr 22 17:34:53.413341 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:53.413306 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:34:53.413529 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:34:53.413355 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v"
Apr 22 17:34:53.413529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.413450 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:34:53.413529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.413488 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:34:53.413529 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.413510 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:09.413495907 +0000 UTC m=+65.395141834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found
Apr 22 17:34:53.413683 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:34:53.413533 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:09.413519259 +0000 UTC m=+65.395165187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found
Apr 22 17:35:03.675172 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:03.675144 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vckbs"
Apr 22 17:35:09.343765 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:09.343719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:35:09.344254 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:09.343782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r"
Apr 22 17:35:09.344254 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.343903 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:35:09.344254 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.343912 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:35:09.344254 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.343974 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found
Apr 22 17:35:09.344254 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.344028 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:41.343993356 +0000 UTC m=+97.325639303 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found
Apr 22 17:35:09.344254 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.344069 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:35:41.344056502 +0000 UTC m=+97.325702433 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found
Apr 22 17:35:09.445095 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:09.445056 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:35:09.445095 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:09.445103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v"
Apr 22 17:35:09.445345 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.445212 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:09.445345 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.445222 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:09.445345 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.445261 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:41.445247913 +0000 UTC m=+97.426893841 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found
Apr 22 17:35:09.445345 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:09.445294 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:41.445274495 +0000 UTC m=+97.426920426 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:10.250993 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:10.250926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:35:10.251175 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:10.251072 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:35:10.251175 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:10.251137 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:14.251121074 +0000 UTC m=+130.232767005 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : secret "metrics-daemon-secret" not found
Apr 22 17:35:18.741018 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:18.740986 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6wq9d"
Apr 22 17:35:41.384053 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:41.384007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:35:41.384537 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:41.384082 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r"
Apr 22 17:35:41.384537 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.384163 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 17:35:41.384537 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.384222 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:35:41.384537 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.384235 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:45.384218058 +0000 UTC m=+161.365863984 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found
Apr 22 17:35:41.384537 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.384236 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85644b5694-r229r: secret "image-registry-tls" not found
Apr 22 17:35:41.384537 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.384288 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls podName:ac05f344-d6fd-4e33-aefe-4328398b5faf nodeName:}" failed. No retries permitted until 2026-04-22 17:36:45.384271551 +0000 UTC m=+161.365917493 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls") pod "image-registry-85644b5694-r229r" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf") : secret "image-registry-tls" not found
Apr 22 17:35:41.485269 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:41.485230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:35:41.485269 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:35:41.485278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v"
Apr 22 17:35:41.485440 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.485373 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:41.485440 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.485378 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:41.485440 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.485435 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert podName:3abfaf4a-ddaa-41da-ad6c-d710aa0ba979 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:45.485421252 +0000 UTC m=+161.467067179 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert") pod "ingress-canary-dp46v" (UID: "3abfaf4a-ddaa-41da-ad6c-d710aa0ba979") : secret "canary-serving-cert" not found
Apr 22 17:35:41.485548 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:35:41.485448 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:45.485442013 +0000 UTC m=+161.467087940 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found
Apr 22 17:36:14.322846 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:14.322797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:36:14.323452 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:14.322967 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:36:14.323452 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:14.323040 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs podName:71cf0a1f-9d8d-4195-9355-be900422df45 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:16.323024141 +0000 UTC m=+252.304670069 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs") pod "network-metrics-daemon-cb4rf" (UID: "71cf0a1f-9d8d-4195-9355-be900422df45") : secret "metrics-daemon-secret" not found
Apr 22 17:36:28.361079 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:28.361050 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-snnsr_4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc/dns-node-resolver/0.log"
Apr 22 17:36:28.961390 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:28.961364 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-c6mcn_413cd8ee-53a3-4bf3-9251-11b54262bb83/node-ca/0.log"
Apr 22 17:36:39.615046 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.615014 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-w688w"]
Apr 22 17:36:39.620374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.620347 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.622713 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.622693 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 17:36:39.622849 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.622751 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-c22fx\""
Apr 22 17:36:39.622849 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.622807 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 17:36:39.623672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.623641 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 17:36:39.623784 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.623681 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 17:36:39.628357 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.628331 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w688w"]
Apr 22 17:36:39.714651 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.714608 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39adb942-f356-4943-b0ff-5b516ca93173-crio-socket\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.714825 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.714713 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb4r6\" (UniqueName: \"kubernetes.io/projected/39adb942-f356-4943-b0ff-5b516ca93173-kube-api-access-cb4r6\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.714825 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.714790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39adb942-f356-4943-b0ff-5b516ca93173-data-volume\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.714902 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.714820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.714902 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.714860 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39adb942-f356-4943-b0ff-5b516ca93173-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.815426 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.815374 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.815585 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.815441 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39adb942-f356-4943-b0ff-5b516ca93173-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.815585 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.815509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39adb942-f356-4943-b0ff-5b516ca93173-crio-socket\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.815585 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:39.815531 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:39.815585 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.815573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb4r6\" (UniqueName: \"kubernetes.io/projected/39adb942-f356-4943-b0ff-5b516ca93173-kube-api-access-cb4r6\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.815719 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:39.815614 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls podName:39adb942-f356-4943-b0ff-5b516ca93173 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:40.31559389 +0000 UTC m=+156.297239823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls") pod "insights-runtime-extractor-w688w" (UID: "39adb942-f356-4943-b0ff-5b516ca93173") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:39.815719 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.815650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39adb942-f356-4943-b0ff-5b516ca93173-crio-socket\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.815807 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.815728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39adb942-f356-4943-b0ff-5b516ca93173-data-volume\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.816024 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.816005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39adb942-f356-4943-b0ff-5b516ca93173-data-volume\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.816102 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.816041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39adb942-f356-4943-b0ff-5b516ca93173-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:39.825429 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:39.825397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb4r6\" (UniqueName: \"kubernetes.io/projected/39adb942-f356-4943-b0ff-5b516ca93173-kube-api-access-cb4r6\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:40.319376 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:40.319330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w"
Apr 22 17:36:40.319577 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.319467 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:40.319577 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.319538 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls podName:39adb942-f356-4943-b0ff-5b516ca93173 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:41.319521992 +0000 UTC m=+157.301167919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls") pod "insights-runtime-extractor-w688w" (UID: "39adb942-f356-4943-b0ff-5b516ca93173") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:36:40.467183 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.467139 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-85644b5694-r229r" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf"
Apr 22 17:36:40.474287 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.474251 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" podUID="47363b61-021b-4c33-bee9-3f7fb1dc9969"
Apr 22 17:36:40.526628 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.526590 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5n8fn" podUID="21fe7e60-7b3d-470d-8f9f-63da9bbccfa6"
Apr 22 17:36:40.535693 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.535654 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dp46v" podUID="3abfaf4a-ddaa-41da-ad6c-d710aa0ba979"
Apr 22 17:36:40.543928 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:40.543889 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]:
context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cb4rf" podUID="71cf0a1f-9d8d-4195-9355-be900422df45" Apr 22 17:36:41.013584 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:41.013554 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:36:41.013989 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:41.013660 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:36:41.013989 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:41.013831 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:36:41.013989 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:41.013958 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5n8fn" Apr 22 17:36:41.329024 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:41.328998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w" Apr 22 17:36:41.329189 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:41.329110 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:41.329189 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:41.329167 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls podName:39adb942-f356-4943-b0ff-5b516ca93173 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:36:43.329152122 +0000 UTC m=+159.310798049 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls") pod "insights-runtime-extractor-w688w" (UID: "39adb942-f356-4943-b0ff-5b516ca93173") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:43.345697 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:43.345655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w" Apr 22 17:36:43.346123 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:43.345798 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:36:43.346123 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:43.345863 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls podName:39adb942-f356-4943-b0ff-5b516ca93173 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:47.345847324 +0000 UTC m=+163.327493251 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls") pod "insights-runtime-extractor-w688w" (UID: "39adb942-f356-4943-b0ff-5b516ca93173") : secret "insights-runtime-extractor-tls" not found Apr 22 17:36:45.463672 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.463626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" Apr 22 17:36:45.464311 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.463702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:36:45.464311 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:45.463798 2568 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 17:36:45.464311 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:45.463889 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert podName:47363b61-021b-4c33-bee9-3f7fb1dc9969 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:47.463866854 +0000 UTC m=+283.445512787 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wnwfv" (UID: "47363b61-021b-4c33-bee9-3f7fb1dc9969") : secret "networking-console-plugin-cert" not found Apr 22 17:36:45.466133 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.466112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"image-registry-85644b5694-r229r\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") " pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:36:45.516999 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.516966 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rmqdq\"" Apr 22 17:36:45.525198 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.525171 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:36:45.564655 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.564624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn" Apr 22 17:36:45.564835 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.564662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:36:45.564835 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:45.564788 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:36:45.564968 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:36:45.564869 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls podName:21fe7e60-7b3d-470d-8f9f-63da9bbccfa6 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:47.564845505 +0000 UTC m=+283.546491448 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls") pod "dns-default-5n8fn" (UID: "21fe7e60-7b3d-470d-8f9f-63da9bbccfa6") : secret "dns-default-metrics-tls" not found Apr 22 17:36:45.567027 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.567005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3abfaf4a-ddaa-41da-ad6c-d710aa0ba979-cert\") pod \"ingress-canary-dp46v\" (UID: \"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979\") " pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:36:45.647181 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.647147 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85644b5694-r229r"] Apr 22 17:36:45.649890 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:36:45.649866 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac05f344_d6fd_4e33_aefe_4328398b5faf.slice/crio-18882a5d0b3ece9d7145da18dc3a75627a3840c82dbe3960bc7a44582bf510ba WatchSource:0}: Error finding container 18882a5d0b3ece9d7145da18dc3a75627a3840c82dbe3960bc7a44582bf510ba: Status 404 returned error can't find the container with id 18882a5d0b3ece9d7145da18dc3a75627a3840c82dbe3960bc7a44582bf510ba Apr 22 17:36:45.817192 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.817102 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dsn9k\"" Apr 22 17:36:45.825193 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.825174 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp46v" Apr 22 17:36:45.945619 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:45.945590 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp46v"] Apr 22 17:36:45.950188 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:36:45.950157 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abfaf4a_ddaa_41da_ad6c_d710aa0ba979.slice/crio-8bf54cb9f96c911d5c30d114462731ae9aa446da516fed2ed569dd8bcbe9872e WatchSource:0}: Error finding container 8bf54cb9f96c911d5c30d114462731ae9aa446da516fed2ed569dd8bcbe9872e: Status 404 returned error can't find the container with id 8bf54cb9f96c911d5c30d114462731ae9aa446da516fed2ed569dd8bcbe9872e Apr 22 17:36:46.025406 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:46.025369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp46v" event={"ID":"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979","Type":"ContainerStarted","Data":"8bf54cb9f96c911d5c30d114462731ae9aa446da516fed2ed569dd8bcbe9872e"} Apr 22 17:36:46.026710 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:46.026674 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85644b5694-r229r" event={"ID":"ac05f344-d6fd-4e33-aefe-4328398b5faf","Type":"ContainerStarted","Data":"b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e"} Apr 22 17:36:46.026710 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:46.026705 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85644b5694-r229r" event={"ID":"ac05f344-d6fd-4e33-aefe-4328398b5faf","Type":"ContainerStarted","Data":"18882a5d0b3ece9d7145da18dc3a75627a3840c82dbe3960bc7a44582bf510ba"} Apr 22 17:36:46.026875 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:46.026793 2568 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-85644b5694-r229r" Apr 22 17:36:46.046485 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:46.046431 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-85644b5694-r229r" podStartSLOduration=162.046416403 podStartE2EDuration="2m42.046416403s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:36:46.045647457 +0000 UTC m=+162.027293431" watchObservedRunningTime="2026-04-22 17:36:46.046416403 +0000 UTC m=+162.028062352" Apr 22 17:36:47.030679 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.030640 2568 generic.go:358] "Generic (PLEG): container finished" podID="e2aa7890-fb2e-4dbf-939b-479f29da74e1" containerID="c6633d5ad6b4b445f96cc5d0c782196d81a992401a35eeb8f24476665a72eb03" exitCode=255 Apr 22 17:36:47.031176 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.030716 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" event={"ID":"e2aa7890-fb2e-4dbf-939b-479f29da74e1","Type":"ContainerDied","Data":"c6633d5ad6b4b445f96cc5d0c782196d81a992401a35eeb8f24476665a72eb03"} Apr 22 17:36:47.031323 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.031305 2568 scope.go:117] "RemoveContainer" containerID="c6633d5ad6b4b445f96cc5d0c782196d81a992401a35eeb8f24476665a72eb03" Apr 22 17:36:47.032439 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.032391 2568 generic.go:358] "Generic (PLEG): container finished" podID="4b0b3cbd-6bf3-41d3-a5a0-f236662c2033" containerID="2044a4101ce229e9a148a6039556abbd8529ed8c5955514298128f53f231807e" exitCode=1 Apr 22 17:36:47.032531 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.032431 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" event={"ID":"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033","Type":"ContainerDied","Data":"2044a4101ce229e9a148a6039556abbd8529ed8c5955514298128f53f231807e"} Apr 22 17:36:47.032952 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.032865 2568 scope.go:117] "RemoveContainer" containerID="2044a4101ce229e9a148a6039556abbd8529ed8c5955514298128f53f231807e" Apr 22 17:36:47.380999 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.380961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w" Apr 22 17:36:47.383904 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.383868 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39adb942-f356-4943-b0ff-5b516ca93173-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w688w\" (UID: \"39adb942-f356-4943-b0ff-5b516ca93173\") " pod="openshift-insights/insights-runtime-extractor-w688w" Apr 22 17:36:47.429514 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.429479 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w688w" Apr 22 17:36:47.688097 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.688048 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w688w"] Apr 22 17:36:47.693761 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:36:47.693730 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39adb942_f356_4943_b0ff_5b516ca93173.slice/crio-544898614e89cab1f9d05d5b2bd7c9e6f85b77186271fc4a76711988905f68d0 WatchSource:0}: Error finding container 544898614e89cab1f9d05d5b2bd7c9e6f85b77186271fc4a76711988905f68d0: Status 404 returned error can't find the container with id 544898614e89cab1f9d05d5b2bd7c9e6f85b77186271fc4a76711988905f68d0 Apr 22 17:36:47.730925 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.730881 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:36:47.781858 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.781818 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:36:47.795630 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:47.795579 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" Apr 22 17:36:48.036553 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.036454 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w688w" event={"ID":"39adb942-f356-4943-b0ff-5b516ca93173","Type":"ContainerStarted","Data":"4f495f544f9e00609cc078cc259b33236793f47ce4ca92a93266df256bbdbcbf"} Apr 22 17:36:48.036553 ip-10-0-133-169 kubenswrapper[2568]: I0422 
17:36:48.036492 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w688w" event={"ID":"39adb942-f356-4943-b0ff-5b516ca93173","Type":"ContainerStarted","Data":"544898614e89cab1f9d05d5b2bd7c9e6f85b77186271fc4a76711988905f68d0"} Apr 22 17:36:48.038008 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.037980 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6769747df7-s6hdx" event={"ID":"e2aa7890-fb2e-4dbf-939b-479f29da74e1","Type":"ContainerStarted","Data":"4c17328b9d4b9828bc985af760c9b260614cbc662593b5ee97ea1b2e1df17b3b"} Apr 22 17:36:48.039545 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.039520 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" event={"ID":"4b0b3cbd-6bf3-41d3-a5a0-f236662c2033","Type":"ContainerStarted","Data":"c448a8fe76985dc7a15b7940df4a0b19d8c2a8cff9952967936d7b19ef528ea7"} Apr 22 17:36:48.039754 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.039732 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:36:48.040359 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.040341 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8487cc97d4-fq6gp" Apr 22 17:36:48.040808 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.040784 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp46v" event={"ID":"3abfaf4a-ddaa-41da-ad6c-d710aa0ba979","Type":"ContainerStarted","Data":"0dc9f4d117ff5ffb354ef2295ba35640161e782b70d5c47886068e58064787bd"} Apr 22 17:36:48.066341 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:48.066287 2568 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress-canary/ingress-canary-dp46v" podStartSLOduration=129.393511399 podStartE2EDuration="2m11.066271237s" podCreationTimestamp="2026-04-22 17:34:37 +0000 UTC" firstStartedPulling="2026-04-22 17:36:45.952089571 +0000 UTC m=+161.933735498" lastFinishedPulling="2026-04-22 17:36:47.624849396 +0000 UTC m=+163.606495336" observedRunningTime="2026-04-22 17:36:48.06552251 +0000 UTC m=+164.047168460" watchObservedRunningTime="2026-04-22 17:36:48.066271237 +0000 UTC m=+164.047917209" Apr 22 17:36:49.044513 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:49.044471 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w688w" event={"ID":"39adb942-f356-4943-b0ff-5b516ca93173","Type":"ContainerStarted","Data":"bb768a3f1a6bd56dea7c8b7e1c9b1271844bbb9c12ca0b1e625557a4b1f042f8"} Apr 22 17:36:50.048895 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:50.048862 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w688w" event={"ID":"39adb942-f356-4943-b0ff-5b516ca93173","Type":"ContainerStarted","Data":"ebf5cda5e835fd289a8857249d22466bb4a3fd3e8be0d922f9f5735c4b4754d4"} Apr 22 17:36:50.066815 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:50.066770 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-w688w" podStartSLOduration=8.974904629 podStartE2EDuration="11.066756779s" podCreationTimestamp="2026-04-22 17:36:39 +0000 UTC" firstStartedPulling="2026-04-22 17:36:47.751118029 +0000 UTC m=+163.732763956" lastFinishedPulling="2026-04-22 17:36:49.842970166 +0000 UTC m=+165.824616106" observedRunningTime="2026-04-22 17:36:50.065104557 +0000 UTC m=+166.046750531" watchObservedRunningTime="2026-04-22 17:36:50.066756779 +0000 UTC m=+166.048402727" Apr 22 17:36:52.528230 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:36:52.528197 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf" Apr 22 17:37:05.529700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.529611 2568 patch_prober.go:28] interesting pod/image-registry-85644b5694-r229r container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 17:37:05.529700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.529666 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-85644b5694-r229r" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:37:05.973504 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.973465 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5xpcn"] Apr 22 17:37:05.976102 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.976082 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:05.978879 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.978679 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:37:05.979036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.978930 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:37:05.979111 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.979061 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:37:05.979192 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.979168 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:37:05.980355 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.980328 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5mbzq\"" Apr 22 17:37:05.980641 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.980619 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:37:05.980888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:05.980867 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:37:06.022378 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-sys\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " 
pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022572 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022572 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022418 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58d47ea9-316e-41f3-8a86-33b4f43187fe-metrics-client-ca\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022572 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-root\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022745 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-wtmp\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022745 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022636 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-textfile\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022745 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-accelerators-collector-config\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022745 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72gx\" (UniqueName: \"kubernetes.io/projected/58d47ea9-316e-41f3-8a86-33b4f43187fe-kube-api-access-c72gx\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.022930 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.022788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-tls\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.123506 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58d47ea9-316e-41f3-8a86-33b4f43187fe-metrics-client-ca\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn" Apr 22 17:37:06.123506 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:37:06.123509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-root\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.123769 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-wtmp\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.123769 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-textfile\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.123769 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123709 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-root\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.123769 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123743 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-accelerators-collector-config\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-wtmp\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72gx\" (UniqueName: \"kubernetes.io/projected/58d47ea9-316e-41f3-8a86-33b4f43187fe-kube-api-access-c72gx\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123879 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-tls\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-sys\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123988 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-textfile\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.123989 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124308 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.124089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/58d47ea9-316e-41f3-8a86-33b4f43187fe-sys\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124308 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.124169 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/58d47ea9-316e-41f3-8a86-33b4f43187fe-metrics-client-ca\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.124308 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.124292 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-accelerators-collector-config\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.126205 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.126183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.126299 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.126243 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/58d47ea9-316e-41f3-8a86-33b4f43187fe-node-exporter-tls\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.139474 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.139440 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72gx\" (UniqueName: \"kubernetes.io/projected/58d47ea9-316e-41f3-8a86-33b4f43187fe-kube-api-access-c72gx\") pod \"node-exporter-5xpcn\" (UID: \"58d47ea9-316e-41f3-8a86-33b4f43187fe\") " pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.287395 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:06.287311 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5xpcn"
Apr 22 17:37:06.296707 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:37:06.296681 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d47ea9_316e_41f3_8a86_33b4f43187fe.slice/crio-47f36c8ce24f144a1b2fd7df55e14f6a2a85961cef648f6cf152e08906943b89 WatchSource:0}: Error finding container 47f36c8ce24f144a1b2fd7df55e14f6a2a85961cef648f6cf152e08906943b89: Status 404 returned error can't find the container with id 47f36c8ce24f144a1b2fd7df55e14f6a2a85961cef648f6cf152e08906943b89
Apr 22 17:37:07.038481 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:07.038448 2568 patch_prober.go:28] interesting pod/image-registry-85644b5694-r229r container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 17:37:07.038859 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:07.038514 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-85644b5694-r229r" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:37:07.090296 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:07.090260 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5xpcn" event={"ID":"58d47ea9-316e-41f3-8a86-33b4f43187fe","Type":"ContainerStarted","Data":"47f36c8ce24f144a1b2fd7df55e14f6a2a85961cef648f6cf152e08906943b89"}
Apr 22 17:37:08.093993 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:08.093927 2568 generic.go:358] "Generic (PLEG): container finished" podID="58d47ea9-316e-41f3-8a86-33b4f43187fe" containerID="f9ac26e46efc7b2bd810233307e1982ac48ca65b22b46ef5e963869a7b87f0a0" exitCode=0
Apr 22 17:37:08.094483 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:08.094004 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5xpcn" event={"ID":"58d47ea9-316e-41f3-8a86-33b4f43187fe","Type":"ContainerDied","Data":"f9ac26e46efc7b2bd810233307e1982ac48ca65b22b46ef5e963869a7b87f0a0"}
Apr 22 17:37:09.101418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:09.101382 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5xpcn" event={"ID":"58d47ea9-316e-41f3-8a86-33b4f43187fe","Type":"ContainerStarted","Data":"abebbbedb6dae30fe81259bb699709956b467f37a2ac9b43a1b6f14bfc6a6e04"}
Apr 22 17:37:09.101418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:09.101418 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5xpcn" event={"ID":"58d47ea9-316e-41f3-8a86-33b4f43187fe","Type":"ContainerStarted","Data":"be6498540f26eb89606e658712dfec165c5eb3d9c3d142289f918459a1262188"}
Apr 22 17:37:09.121797 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:09.121744 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5xpcn" podStartSLOduration=3.242765633 podStartE2EDuration="4.121730297s" podCreationTimestamp="2026-04-22 17:37:05 +0000 UTC" firstStartedPulling="2026-04-22 17:37:06.298817627 +0000 UTC m=+182.280463554" lastFinishedPulling="2026-04-22 17:37:07.177782274 +0000 UTC m=+183.159428218" observedRunningTime="2026-04-22 17:37:09.120758267 +0000 UTC m=+185.102404217" watchObservedRunningTime="2026-04-22 17:37:09.121730297 +0000 UTC m=+185.103376243"
Apr 22 17:37:15.529700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:15.529664 2568 patch_prober.go:28] interesting pod/image-registry-85644b5694-r229r container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 17:37:15.530209 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:15.529724 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-85644b5694-r229r" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:37:17.036395 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:17.036366 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85644b5694-r229r"
Apr 22 17:37:21.543116 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:21.543077 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85644b5694-r229r"]
Apr 22 17:37:27.805760 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:27.805719 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" podUID="dd6771b1-2f0a-4818-8710-c7d543723c88" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 17:37:37.805430 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:37.805391 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" podUID="dd6771b1-2f0a-4818-8710-c7d543723c88" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 17:37:46.561585 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.561527 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-85644b5694-r229r" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" containerID="cri-o://b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e" gracePeriod=30
Apr 22 17:37:46.783013 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.782992 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85644b5694-r229r"
Apr 22 17:37:46.949654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949623 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac05f344-d6fd-4e33-aefe-4328398b5faf-ca-trust-extracted\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.949654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949661 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-bound-sa-token\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.949882 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949679 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.949882 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949703 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z96pm\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-kube-api-access-z96pm\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.949882 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949758 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-image-registry-private-configuration\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.949882 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949790 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-certificates\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.949882 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.949824 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-installation-pull-secrets\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.951036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.950014 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-trusted-ca\") pod \"ac05f344-d6fd-4e33-aefe-4328398b5faf\" (UID: \"ac05f344-d6fd-4e33-aefe-4328398b5faf\") "
Apr 22 17:37:46.951036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.950308 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:37:46.951036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.950661 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:37:46.952276 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.952223 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:37:46.952385 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.952270 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:37:46.952385 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.952280 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-kube-api-access-z96pm" (OuterVolumeSpecName: "kube-api-access-z96pm") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "kube-api-access-z96pm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:37:46.952488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.952375 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:37:46.952488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.952427 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:37:46.958079 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:46.958054 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac05f344-d6fd-4e33-aefe-4328398b5faf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ac05f344-d6fd-4e33-aefe-4328398b5faf" (UID: "ac05f344-d6fd-4e33-aefe-4328398b5faf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:37:47.051208 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051171 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac05f344-d6fd-4e33-aefe-4328398b5faf-ca-trust-extracted\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051208 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051205 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-bound-sa-token\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051208 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051216 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-tls\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051225 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z96pm\" (UniqueName: \"kubernetes.io/projected/ac05f344-d6fd-4e33-aefe-4328398b5faf-kube-api-access-z96pm\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051235 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-image-registry-private-configuration\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051244 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-registry-certificates\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051254 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac05f344-d6fd-4e33-aefe-4328398b5faf-installation-pull-secrets\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.051374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.051262 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac05f344-d6fd-4e33-aefe-4328398b5faf-trusted-ca\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:37:47.196751 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.196716 2568 generic.go:358] "Generic (PLEG): container finished" podID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerID="b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e" exitCode=0
Apr 22 17:37:47.196906 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.196776 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85644b5694-r229r"
Apr 22 17:37:47.196906 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.196791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85644b5694-r229r" event={"ID":"ac05f344-d6fd-4e33-aefe-4328398b5faf","Type":"ContainerDied","Data":"b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e"}
Apr 22 17:37:47.196906 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.196818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85644b5694-r229r" event={"ID":"ac05f344-d6fd-4e33-aefe-4328398b5faf","Type":"ContainerDied","Data":"18882a5d0b3ece9d7145da18dc3a75627a3840c82dbe3960bc7a44582bf510ba"}
Apr 22 17:37:47.196906 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.196832 2568 scope.go:117] "RemoveContainer" containerID="b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e"
Apr 22 17:37:47.204386 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.204369 2568 scope.go:117] "RemoveContainer" containerID="b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e"
Apr 22 17:37:47.204645 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:37:47.204623 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e\": container with ID starting with b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e not found: ID does not exist" containerID="b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e"
Apr 22 17:37:47.204709 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.204651 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e"} err="failed to get container status \"b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e\": rpc error: code = NotFound desc = could not find container \"b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e\": container with ID starting with b2f87a8304fcbc56cd21f1dc712098be99b02197c5acf562650e91e0a15ad03e not found: ID does not exist"
Apr 22 17:37:47.218462 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.218442 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85644b5694-r229r"]
Apr 22 17:37:47.224705 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.224685 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85644b5694-r229r"]
Apr 22 17:37:47.805162 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.805122 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" podUID="dd6771b1-2f0a-4818-8710-c7d543723c88" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 17:37:47.805623 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.805217 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n"
Apr 22 17:37:47.805667 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.805645 2568 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"5a9e0a93e2489fd9dd3b78b23b507bc0b675a6878190780a60728879cb53f30e"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 22 17:37:47.805701 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:47.805679 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" podUID="dd6771b1-2f0a-4818-8710-c7d543723c88" containerName="service-proxy" containerID="cri-o://5a9e0a93e2489fd9dd3b78b23b507bc0b675a6878190780a60728879cb53f30e" gracePeriod=30
Apr 22 17:37:48.201774 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:48.201735 2568 generic.go:358] "Generic (PLEG): container finished" podID="dd6771b1-2f0a-4818-8710-c7d543723c88" containerID="5a9e0a93e2489fd9dd3b78b23b507bc0b675a6878190780a60728879cb53f30e" exitCode=2
Apr 22 17:37:48.201986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:48.201780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" event={"ID":"dd6771b1-2f0a-4818-8710-c7d543723c88","Type":"ContainerDied","Data":"5a9e0a93e2489fd9dd3b78b23b507bc0b675a6878190780a60728879cb53f30e"}
Apr 22 17:37:48.201986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:48.201810 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-fd854664f-fn89n" event={"ID":"dd6771b1-2f0a-4818-8710-c7d543723c88","Type":"ContainerStarted","Data":"c7b893d3f4aae12b9fc68cce82d349c1ba6c8c7d0f024a2c129cd5974cf3fdeb"}
Apr 22 17:37:48.531784 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:37:48.531705 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" path="/var/lib/kubelet/pods/ac05f344-d6fd-4e33-aefe-4328398b5faf/volumes"
Apr 22 17:38:16.366051 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:16.366009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:38:16.368262 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:16.368242 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/71cf0a1f-9d8d-4195-9355-be900422df45-metrics-certs\") pod \"network-metrics-daemon-cb4rf\" (UID: \"71cf0a1f-9d8d-4195-9355-be900422df45\") " pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:38:16.531188 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:16.531159 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kkblr\""
Apr 22 17:38:16.538994 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:16.538976 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cb4rf"
Apr 22 17:38:16.652323 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:16.652238 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cb4rf"]
Apr 22 17:38:16.656308 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:38:16.656284 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71cf0a1f_9d8d_4195_9355_be900422df45.slice/crio-227a9fc1784de07475ee0dc446821123d9eba06ab5ae74961c143fc447d8da28 WatchSource:0}: Error finding container 227a9fc1784de07475ee0dc446821123d9eba06ab5ae74961c143fc447d8da28: Status 404 returned error can't find the container with id 227a9fc1784de07475ee0dc446821123d9eba06ab5ae74961c143fc447d8da28
Apr 22 17:38:17.275471 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:17.275429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cb4rf" event={"ID":"71cf0a1f-9d8d-4195-9355-be900422df45","Type":"ContainerStarted","Data":"227a9fc1784de07475ee0dc446821123d9eba06ab5ae74961c143fc447d8da28"}
Apr 22 17:38:18.280275 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:18.280239 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cb4rf" event={"ID":"71cf0a1f-9d8d-4195-9355-be900422df45","Type":"ContainerStarted","Data":"76da6f938cd322675b5404d8fb40b74117924c44640b7fed88e72d3537b2a2d9"}
Apr 22 17:38:18.280275 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:18.280275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cb4rf" event={"ID":"71cf0a1f-9d8d-4195-9355-be900422df45","Type":"ContainerStarted","Data":"5e93f9ee96ea5672559efdb3c6ecc2864a4fb10bba6109a87b993accf18e255f"}
Apr 22 17:38:18.297277 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:18.297239 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cb4rf" podStartSLOduration=253.42137835 podStartE2EDuration="4m14.29722646s" podCreationTimestamp="2026-04-22 17:34:04 +0000 UTC" firstStartedPulling="2026-04-22 17:38:16.658085193 +0000 UTC m=+252.639731123" lastFinishedPulling="2026-04-22 17:38:17.533933305 +0000 UTC m=+253.515579233" observedRunningTime="2026-04-22 17:38:18.296185417 +0000 UTC m=+254.277831519" watchObservedRunningTime="2026-04-22 17:38:18.29722646 +0000 UTC m=+254.278872408"
Apr 22 17:38:44.014842 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:38:44.014791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" podUID="47363b61-021b-4c33-bee9-3f7fb1dc9969"
Apr 22 17:38:44.014842 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:38:44.014791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5n8fn" podUID="21fe7e60-7b3d-470d-8f9f-63da9bbccfa6"
Apr 22 17:38:44.348149 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:44.348116 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:38:44.348309 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:44.348117 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:38:47.482602 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.482570 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:38:47.484899 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.484876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47363b61-021b-4c33-bee9-3f7fb1dc9969-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wnwfv\" (UID: \"47363b61-021b-4c33-bee9-3f7fb1dc9969\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:38:47.583206 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.583162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:38:47.585522 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.585488 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/21fe7e60-7b3d-470d-8f9f-63da9bbccfa6-metrics-tls\") pod \"dns-default-5n8fn\" (UID: \"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6\") " pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:38:47.651097 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.651067 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7g4b5\""
Apr 22 17:38:47.651870 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.651838 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dsn92\""
Apr 22 17:38:47.659798 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.659775 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"
Apr 22 17:38:47.659921 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.659816 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5n8fn"
Apr 22 17:38:47.798667 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.798552 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5n8fn"]
Apr 22 17:38:47.801268 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:38:47.801212 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21fe7e60_7b3d_470d_8f9f_63da9bbccfa6.slice/crio-29743d05ec064eb6ac5b49a3df3d801475946bc287fcbb3646d2c1a79251968e WatchSource:0}: Error finding container 29743d05ec064eb6ac5b49a3df3d801475946bc287fcbb3646d2c1a79251968e: Status 404 returned error can't find the container with id 29743d05ec064eb6ac5b49a3df3d801475946bc287fcbb3646d2c1a79251968e
Apr 22 17:38:47.809220 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:47.809191 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv"]
Apr 22 17:38:47.812572 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:38:47.812545 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47363b61_021b_4c33_bee9_3f7fb1dc9969.slice/crio-13172fa6c730312c75f10c570b37ff450b19086d4c38fe52785025a90d129c58 WatchSource:0}: Error finding container 13172fa6c730312c75f10c570b37ff450b19086d4c38fe52785025a90d129c58: Status 404 returned error can't find the container with id 13172fa6c730312c75f10c570b37ff450b19086d4c38fe52785025a90d129c58
Apr 22 17:38:48.358596 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:48.358551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" event={"ID":"47363b61-021b-4c33-bee9-3f7fb1dc9969","Type":"ContainerStarted","Data":"13172fa6c730312c75f10c570b37ff450b19086d4c38fe52785025a90d129c58"}
Apr 22 17:38:48.359634 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:48.359606 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5n8fn" event={"ID":"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6","Type":"ContainerStarted","Data":"29743d05ec064eb6ac5b49a3df3d801475946bc287fcbb3646d2c1a79251968e"}
Apr 22 17:38:49.367695 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:49.367580 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" event={"ID":"47363b61-021b-4c33-bee9-3f7fb1dc9969","Type":"ContainerStarted","Data":"009f8cc03a9bc2054f15329990cc0d7e4d5de2fba3184fc4de9f77a2f5e5bd9c"}
Apr 22 17:38:49.369258 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:49.369233 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5n8fn" event={"ID":"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6","Type":"ContainerStarted","Data":"71a11310acd33f8f6b24e725a8fcf098e25dcc047d0a593e952f7924003da3f2"}
Apr 22 17:38:49.383682 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:49.383585 2568 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wnwfv" podStartSLOduration=267.970105114 podStartE2EDuration="4m29.383566822s" podCreationTimestamp="2026-04-22 17:34:20 +0000 UTC" firstStartedPulling="2026-04-22 17:38:47.814308672 +0000 UTC m=+283.795954599" lastFinishedPulling="2026-04-22 17:38:49.227770381 +0000 UTC m=+285.209416307" observedRunningTime="2026-04-22 17:38:49.382733599 +0000 UTC m=+285.364379548" watchObservedRunningTime="2026-04-22 17:38:49.383566822 +0000 UTC m=+285.365212770" Apr 22 17:38:50.373469 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:50.373433 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5n8fn" event={"ID":"21fe7e60-7b3d-470d-8f9f-63da9bbccfa6","Type":"ContainerStarted","Data":"438e33e55fcd91b3bba0cb9b4531a640bd6dc2c7e8845eafea5c101109fded77"} Apr 22 17:38:50.390927 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:50.390861 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5n8fn" podStartSLOduration=251.966407107 podStartE2EDuration="4m13.390842679s" podCreationTimestamp="2026-04-22 17:34:37 +0000 UTC" firstStartedPulling="2026-04-22 17:38:47.803144274 +0000 UTC m=+283.784790203" lastFinishedPulling="2026-04-22 17:38:49.227579835 +0000 UTC m=+285.209225775" observedRunningTime="2026-04-22 17:38:50.390020009 +0000 UTC m=+286.371665957" watchObservedRunningTime="2026-04-22 17:38:50.390842679 +0000 UTC m=+286.372488628" Apr 22 17:38:51.376272 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:38:51.376240 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5n8fn" Apr 22 17:39:01.382601 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:39:01.382568 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5n8fn" Apr 22 17:39:04.420983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:39:04.420929 2568 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:39:04.420983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:39:04.420977 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:39:04.425296 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:39:04.425272 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:42:21.940895 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.940861 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5"] Apr 22 17:42:21.941390 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.941103 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" Apr 22 17:42:21.941390 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.941115 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" Apr 22 17:42:21.941390 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.941169 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac05f344-d6fd-4e33-aefe-4328398b5faf" containerName="registry" Apr 22 17:42:21.943819 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.943801 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:21.946350 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.946329 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:42:21.946485 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.946463 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 17:42:21.946535 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.946492 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-29h8d\"" Apr 22 17:42:21.956388 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:21.956357 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5"] Apr 22 17:42:22.126041 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.125996 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f13884f1-e0d5-40a8-b393-04e369c3b14a-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-l5dj5\" (UID: \"f13884f1-e0d5-40a8-b393-04e369c3b14a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:22.126221 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.126064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xhtd\" (UniqueName: \"kubernetes.io/projected/f13884f1-e0d5-40a8-b393-04e369c3b14a-kube-api-access-4xhtd\") pod \"cert-manager-operator-controller-manager-54b9655956-l5dj5\" (UID: \"f13884f1-e0d5-40a8-b393-04e369c3b14a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" 
Apr 22 17:42:22.226690 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.226596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xhtd\" (UniqueName: \"kubernetes.io/projected/f13884f1-e0d5-40a8-b393-04e369c3b14a-kube-api-access-4xhtd\") pod \"cert-manager-operator-controller-manager-54b9655956-l5dj5\" (UID: \"f13884f1-e0d5-40a8-b393-04e369c3b14a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:22.226836 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.226742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f13884f1-e0d5-40a8-b393-04e369c3b14a-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-l5dj5\" (UID: \"f13884f1-e0d5-40a8-b393-04e369c3b14a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:22.227152 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.227135 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f13884f1-e0d5-40a8-b393-04e369c3b14a-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-l5dj5\" (UID: \"f13884f1-e0d5-40a8-b393-04e369c3b14a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:22.234535 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.234510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xhtd\" (UniqueName: \"kubernetes.io/projected/f13884f1-e0d5-40a8-b393-04e369c3b14a-kube-api-access-4xhtd\") pod \"cert-manager-operator-controller-manager-54b9655956-l5dj5\" (UID: \"f13884f1-e0d5-40a8-b393-04e369c3b14a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:22.252364 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.252323 2568 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" Apr 22 17:42:22.381959 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.381910 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5"] Apr 22 17:42:22.386761 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:42:22.386732 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf13884f1_e0d5_40a8_b393_04e369c3b14a.slice/crio-0163f1dcc8728dbfd019e741a3f6f12c154cebc6296f17450e0a662b26f346fe WatchSource:0}: Error finding container 0163f1dcc8728dbfd019e741a3f6f12c154cebc6296f17450e0a662b26f346fe: Status 404 returned error can't find the container with id 0163f1dcc8728dbfd019e741a3f6f12c154cebc6296f17450e0a662b26f346fe Apr 22 17:42:22.389876 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.389861 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:42:22.926322 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:22.926284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" event={"ID":"f13884f1-e0d5-40a8-b393-04e369c3b14a","Type":"ContainerStarted","Data":"0163f1dcc8728dbfd019e741a3f6f12c154cebc6296f17450e0a662b26f346fe"} Apr 22 17:42:25.937182 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:25.937145 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" event={"ID":"f13884f1-e0d5-40a8-b393-04e369c3b14a","Type":"ContainerStarted","Data":"73a7810edd6ff4bc87f310c258d1313bc427af72c9f82a7a7fb44ef595816ce0"} Apr 22 17:42:25.958871 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:25.958820 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-l5dj5" podStartSLOduration=2.424056495 podStartE2EDuration="4.958804814s" podCreationTimestamp="2026-04-22 17:42:21 +0000 UTC" firstStartedPulling="2026-04-22 17:42:22.390013589 +0000 UTC m=+498.371659516" lastFinishedPulling="2026-04-22 17:42:24.924761904 +0000 UTC m=+500.906407835" observedRunningTime="2026-04-22 17:42:25.957168875 +0000 UTC m=+501.938814825" watchObservedRunningTime="2026-04-22 17:42:25.958804814 +0000 UTC m=+501.940450762" Apr 22 17:42:28.249465 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.249422 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-d58kb"] Apr 22 17:42:28.252732 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.252708 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.255107 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.255076 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-f4m9l\"" Apr 22 17:42:28.255244 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.255118 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 17:42:28.256084 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.256067 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 17:42:28.262372 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.262345 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-d58kb"] Apr 22 17:42:28.272298 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.272258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6ll\" (UniqueName: 
\"kubernetes.io/projected/9df2d9fa-e684-4659-b50b-b67893063e24-kube-api-access-dq6ll\") pod \"cert-manager-webhook-587ccfb98-d58kb\" (UID: \"9df2d9fa-e684-4659-b50b-b67893063e24\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.272443 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.272390 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9df2d9fa-e684-4659-b50b-b67893063e24-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-d58kb\" (UID: \"9df2d9fa-e684-4659-b50b-b67893063e24\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.373229 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.373190 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6ll\" (UniqueName: \"kubernetes.io/projected/9df2d9fa-e684-4659-b50b-b67893063e24-kube-api-access-dq6ll\") pod \"cert-manager-webhook-587ccfb98-d58kb\" (UID: \"9df2d9fa-e684-4659-b50b-b67893063e24\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.373411 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.373253 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9df2d9fa-e684-4659-b50b-b67893063e24-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-d58kb\" (UID: \"9df2d9fa-e684-4659-b50b-b67893063e24\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.382305 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.382279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9df2d9fa-e684-4659-b50b-b67893063e24-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-d58kb\" (UID: \"9df2d9fa-e684-4659-b50b-b67893063e24\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.382494 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.382470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6ll\" (UniqueName: \"kubernetes.io/projected/9df2d9fa-e684-4659-b50b-b67893063e24-kube-api-access-dq6ll\") pod \"cert-manager-webhook-587ccfb98-d58kb\" (UID: \"9df2d9fa-e684-4659-b50b-b67893063e24\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.562598 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.562503 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:28.688272 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.688237 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-d58kb"] Apr 22 17:42:28.691898 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:42:28.691868 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9df2d9fa_e684_4659_b50b_b67893063e24.slice/crio-611e642541caf587b962f74550981d1faabdea39031da2c6150613b01d316ab8 WatchSource:0}: Error finding container 611e642541caf587b962f74550981d1faabdea39031da2c6150613b01d316ab8: Status 404 returned error can't find the container with id 611e642541caf587b962f74550981d1faabdea39031da2c6150613b01d316ab8 Apr 22 17:42:28.951687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:28.951640 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" event={"ID":"9df2d9fa-e684-4659-b50b-b67893063e24","Type":"ContainerStarted","Data":"611e642541caf587b962f74550981d1faabdea39031da2c6150613b01d316ab8"} Apr 22 17:42:29.330746 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.330706 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-q2vf9"] Apr 22 17:42:29.335100 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.335079 
2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.337607 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.337580 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-dz248\"" Apr 22 17:42:29.349308 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.349282 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-q2vf9"] Apr 22 17:42:29.380326 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.380282 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39c2023b-c1f1-4565-b0f2-6d59d11c63fb-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-q2vf9\" (UID: \"39c2023b-c1f1-4565-b0f2-6d59d11c63fb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.380510 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.380337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsxv2\" (UniqueName: \"kubernetes.io/projected/39c2023b-c1f1-4565-b0f2-6d59d11c63fb-kube-api-access-bsxv2\") pod \"cert-manager-cainjector-68b757865b-q2vf9\" (UID: \"39c2023b-c1f1-4565-b0f2-6d59d11c63fb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.480700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.480662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39c2023b-c1f1-4565-b0f2-6d59d11c63fb-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-q2vf9\" (UID: \"39c2023b-c1f1-4565-b0f2-6d59d11c63fb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.480849 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.480709 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsxv2\" (UniqueName: \"kubernetes.io/projected/39c2023b-c1f1-4565-b0f2-6d59d11c63fb-kube-api-access-bsxv2\") pod \"cert-manager-cainjector-68b757865b-q2vf9\" (UID: \"39c2023b-c1f1-4565-b0f2-6d59d11c63fb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.489896 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.489861 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsxv2\" (UniqueName: \"kubernetes.io/projected/39c2023b-c1f1-4565-b0f2-6d59d11c63fb-kube-api-access-bsxv2\") pod \"cert-manager-cainjector-68b757865b-q2vf9\" (UID: \"39c2023b-c1f1-4565-b0f2-6d59d11c63fb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.489896 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.489876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39c2023b-c1f1-4565-b0f2-6d59d11c63fb-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-q2vf9\" (UID: \"39c2023b-c1f1-4565-b0f2-6d59d11c63fb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.644204 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.644110 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" Apr 22 17:42:29.783193 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.783156 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-q2vf9"] Apr 22 17:42:29.786713 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:42:29.786667 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c2023b_c1f1_4565_b0f2_6d59d11c63fb.slice/crio-10501ca1f12bfcec86be193f6ebf3ac2f789d5cf0b854dd814e59fa632a88e78 WatchSource:0}: Error finding container 10501ca1f12bfcec86be193f6ebf3ac2f789d5cf0b854dd814e59fa632a88e78: Status 404 returned error can't find the container with id 10501ca1f12bfcec86be193f6ebf3ac2f789d5cf0b854dd814e59fa632a88e78 Apr 22 17:42:29.956100 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:29.956018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" event={"ID":"39c2023b-c1f1-4565-b0f2-6d59d11c63fb","Type":"ContainerStarted","Data":"10501ca1f12bfcec86be193f6ebf3ac2f789d5cf0b854dd814e59fa632a88e78"} Apr 22 17:42:31.963192 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:31.963151 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" event={"ID":"9df2d9fa-e684-4659-b50b-b67893063e24","Type":"ContainerStarted","Data":"f84355b427a991c6830ee0ac0259d5cb1db9a462e4191e1016dd3e282e6abb84"} Apr 22 17:42:31.963652 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:31.963216 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:31.964527 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:31.964505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" 
event={"ID":"39c2023b-c1f1-4565-b0f2-6d59d11c63fb","Type":"ContainerStarted","Data":"a072ec9bad765bbd0adda6ee6a61ad28b8c1de594ff2f3932266353417cd32e6"} Apr 22 17:42:31.980841 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:31.980777 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" podStartSLOduration=1.350157784 podStartE2EDuration="3.980758352s" podCreationTimestamp="2026-04-22 17:42:28 +0000 UTC" firstStartedPulling="2026-04-22 17:42:28.693774199 +0000 UTC m=+504.675420125" lastFinishedPulling="2026-04-22 17:42:31.324374749 +0000 UTC m=+507.306020693" observedRunningTime="2026-04-22 17:42:31.980068398 +0000 UTC m=+507.961714346" watchObservedRunningTime="2026-04-22 17:42:31.980758352 +0000 UTC m=+507.962404301" Apr 22 17:42:31.997085 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:31.997042 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-q2vf9" podStartSLOduration=1.460827733 podStartE2EDuration="2.997028141s" podCreationTimestamp="2026-04-22 17:42:29 +0000 UTC" firstStartedPulling="2026-04-22 17:42:29.788999772 +0000 UTC m=+505.770645699" lastFinishedPulling="2026-04-22 17:42:31.32520018 +0000 UTC m=+507.306846107" observedRunningTime="2026-04-22 17:42:31.995544342 +0000 UTC m=+507.977190301" watchObservedRunningTime="2026-04-22 17:42:31.997028141 +0000 UTC m=+507.978674092" Apr 22 17:42:37.969780 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:37.969746 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-d58kb" Apr 22 17:42:46.268122 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.268087 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-gg675"] Apr 22 17:42:46.274540 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.274519 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.277416 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.277238 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-sztdl\"" Apr 22 17:42:46.278663 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.278634 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-gg675"] Apr 22 17:42:46.287901 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.287875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8ed0ce2-174a-4b8c-a46e-895c55b646c0-bound-sa-token\") pod \"cert-manager-79c8d999ff-gg675\" (UID: \"d8ed0ce2-174a-4b8c-a46e-895c55b646c0\") " pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.288027 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.287965 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcpp4\" (UniqueName: \"kubernetes.io/projected/d8ed0ce2-174a-4b8c-a46e-895c55b646c0-kube-api-access-hcpp4\") pod \"cert-manager-79c8d999ff-gg675\" (UID: \"d8ed0ce2-174a-4b8c-a46e-895c55b646c0\") " pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.388769 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.388743 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcpp4\" (UniqueName: \"kubernetes.io/projected/d8ed0ce2-174a-4b8c-a46e-895c55b646c0-kube-api-access-hcpp4\") pod \"cert-manager-79c8d999ff-gg675\" (UID: \"d8ed0ce2-174a-4b8c-a46e-895c55b646c0\") " pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.388899 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.388783 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d8ed0ce2-174a-4b8c-a46e-895c55b646c0-bound-sa-token\") pod \"cert-manager-79c8d999ff-gg675\" (UID: \"d8ed0ce2-174a-4b8c-a46e-895c55b646c0\") " pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.396307 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.396280 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8ed0ce2-174a-4b8c-a46e-895c55b646c0-bound-sa-token\") pod \"cert-manager-79c8d999ff-gg675\" (UID: \"d8ed0ce2-174a-4b8c-a46e-895c55b646c0\") " pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.396418 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.396399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcpp4\" (UniqueName: \"kubernetes.io/projected/d8ed0ce2-174a-4b8c-a46e-895c55b646c0-kube-api-access-hcpp4\") pod \"cert-manager-79c8d999ff-gg675\" (UID: \"d8ed0ce2-174a-4b8c-a46e-895c55b646c0\") " pod="cert-manager/cert-manager-79c8d999ff-gg675" Apr 22 17:42:46.584777 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.584755 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-gg675"
Apr 22 17:42:46.703988 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:46.703956 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-gg675"]
Apr 22 17:42:46.706890 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:42:46.706857 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8ed0ce2_174a_4b8c_a46e_895c55b646c0.slice/crio-dc51d044bb30cd62f821af6d69f48d1deb04135765080dbaa7babd60d405ed8f WatchSource:0}: Error finding container dc51d044bb30cd62f821af6d69f48d1deb04135765080dbaa7babd60d405ed8f: Status 404 returned error can't find the container with id dc51d044bb30cd62f821af6d69f48d1deb04135765080dbaa7babd60d405ed8f
Apr 22 17:42:47.003986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:47.003879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-gg675" event={"ID":"d8ed0ce2-174a-4b8c-a46e-895c55b646c0","Type":"ContainerStarted","Data":"808cba123703d2b38dc2aa74c3b1ad5d100e95cc1973805b2e162b4d152521f8"}
Apr 22 17:42:47.003986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:47.003915 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-gg675" event={"ID":"d8ed0ce2-174a-4b8c-a46e-895c55b646c0","Type":"ContainerStarted","Data":"dc51d044bb30cd62f821af6d69f48d1deb04135765080dbaa7babd60d405ed8f"}
Apr 22 17:42:47.030388 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:42:47.030339 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-gg675" podStartSLOduration=1.030325254 podStartE2EDuration="1.030325254s" podCreationTimestamp="2026-04-22 17:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:42:47.028079735 +0000 UTC m=+523.009725685" watchObservedRunningTime="2026-04-22 17:42:47.030325254 +0000 UTC m=+523.011971203"
Apr 22 17:43:25.159488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.159397 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"]
Apr 22 17:43:25.162319 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.162301 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.165055 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.165025 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 22 17:43:25.165055 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.165030 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 17:43:25.165333 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.165270 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 17:43:25.165398 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.165358 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 22 17:43:25.165479 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.165417 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-7nm8j\""
Apr 22 17:43:25.165573 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.165557 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 22 17:43:25.166345 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.166329 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 17:43:25.173119 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.173092 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"]
Apr 22 17:43:25.280577 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvt8\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-kube-api-access-vtvt8\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.280577 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.280783 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280604 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.280783 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.280783 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af3aa698-d3a4-4754-8b44-26b02e966f76-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.280783 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.280783 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.280746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381219 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvt8\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-kube-api-access-vtvt8\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af3aa698-d3a4-4754-8b44-26b02e966f76-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.381713 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.381425 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.382065 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.382039 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.383726 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.383702 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af3aa698-d3a4-4754-8b44-26b02e966f76-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.383806 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.383745 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.383885 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.383869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.383920 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.383880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.389962 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.389910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvt8\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-kube-api-access-vtvt8\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.390122 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.390099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gn82g\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.471897 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.471801 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:25.616299 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:25.616267 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"]
Apr 22 17:43:25.619615 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:43:25.619589 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf3aa698_d3a4_4754_8b44_26b02e966f76.slice/crio-75b94b189f5e3f474028a6017d893be2012bac455234a3ec6b70d051cae38367 WatchSource:0}: Error finding container 75b94b189f5e3f474028a6017d893be2012bac455234a3ec6b70d051cae38367: Status 404 returned error can't find the container with id 75b94b189f5e3f474028a6017d893be2012bac455234a3ec6b70d051cae38367
Apr 22 17:43:26.118399 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:26.118362 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" event={"ID":"af3aa698-d3a4-4754-8b44-26b02e966f76","Type":"ContainerStarted","Data":"75b94b189f5e3f474028a6017d893be2012bac455234a3ec6b70d051cae38367"}
Apr 22 17:43:28.112801 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:28.112757 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 17:43:28.113047 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:28.112842 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 17:43:29.127198 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:29.127160 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" event={"ID":"af3aa698-d3a4-4754-8b44-26b02e966f76","Type":"ContainerStarted","Data":"5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557"}
Apr 22 17:43:29.127681 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:29.127299 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:29.128901 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:29.128877 2568 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-gn82g container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 22 17:43:29.129034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:29.128923 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:43:29.147996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:29.147582 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" podStartSLOduration=1.656517925 podStartE2EDuration="4.147546874s" podCreationTimestamp="2026-04-22 17:43:25 +0000 UTC" firstStartedPulling="2026-04-22 17:43:25.621496839 +0000 UTC m=+561.603142766" lastFinishedPulling="2026-04-22 17:43:28.112525774 +0000 UTC m=+564.094171715" observedRunningTime="2026-04-22 17:43:29.146380166 +0000 UTC m=+565.128026133" watchObservedRunningTime="2026-04-22 17:43:29.147546874 +0000 UTC m=+565.129192824"
Apr 22 17:43:30.131001 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:30.130971 2568 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-gn82g container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 22 17:43:30.131361 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:30.131034 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 17:43:33.131401 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:33.131369 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"
Apr 22 17:43:52.357571 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.357533 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-dmcng"]
Apr 22 17:43:52.360755 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.360740 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:43:52.363808 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.363783 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 17:43:52.364043 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.364024 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 17:43:52.364665 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.364650 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-cnj74\""
Apr 22 17:43:52.373320 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.373300 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-dmcng"]
Apr 22 17:43:52.492842 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.492804 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf6pc\" (UniqueName: \"kubernetes.io/projected/dc1b9d03-8eea-42c8-bf4c-980377435abd-kube-api-access-hf6pc\") pod \"authorino-operator-7587b89b76-dmcng\" (UID: \"dc1b9d03-8eea-42c8-bf4c-980377435abd\") " pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:43:52.593783 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.593751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf6pc\" (UniqueName: \"kubernetes.io/projected/dc1b9d03-8eea-42c8-bf4c-980377435abd-kube-api-access-hf6pc\") pod \"authorino-operator-7587b89b76-dmcng\" (UID: \"dc1b9d03-8eea-42c8-bf4c-980377435abd\") " pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:43:52.605581 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.605558 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf6pc\" (UniqueName: \"kubernetes.io/projected/dc1b9d03-8eea-42c8-bf4c-980377435abd-kube-api-access-hf6pc\") pod \"authorino-operator-7587b89b76-dmcng\" (UID: \"dc1b9d03-8eea-42c8-bf4c-980377435abd\") " pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:43:52.670595 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.670518 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:43:52.810122 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:52.810093 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-dmcng"]
Apr 22 17:43:52.813266 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:43:52.813236 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1b9d03_8eea_42c8_bf4c_980377435abd.slice/crio-2b024215f7653cd72dacc372288f3fb5123f511391bdab68cfe908a6a3380884 WatchSource:0}: Error finding container 2b024215f7653cd72dacc372288f3fb5123f511391bdab68cfe908a6a3380884: Status 404 returned error can't find the container with id 2b024215f7653cd72dacc372288f3fb5123f511391bdab68cfe908a6a3380884
Apr 22 17:43:53.196315 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:53.196279 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng" event={"ID":"dc1b9d03-8eea-42c8-bf4c-980377435abd","Type":"ContainerStarted","Data":"2b024215f7653cd72dacc372288f3fb5123f511391bdab68cfe908a6a3380884"}
Apr 22 17:43:54.564488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.564456 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"]
Apr 22 17:43:54.568862 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.568744 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:43:54.571788 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.571762 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-tj8cp\""
Apr 22 17:43:54.580496 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.580471 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"]
Apr 22 17:43:54.711723 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.711676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz5w8\" (UniqueName: \"kubernetes.io/projected/698e6b0e-c4d5-4d1f-a178-962254f68e1c-kube-api-access-kz5w8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rk74b\" (UID: \"698e6b0e-c4d5-4d1f-a178-962254f68e1c\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:43:54.812787 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.812742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz5w8\" (UniqueName: \"kubernetes.io/projected/698e6b0e-c4d5-4d1f-a178-962254f68e1c-kube-api-access-kz5w8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rk74b\" (UID: \"698e6b0e-c4d5-4d1f-a178-962254f68e1c\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:43:54.822865 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.822764 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz5w8\" (UniqueName: \"kubernetes.io/projected/698e6b0e-c4d5-4d1f-a178-962254f68e1c-kube-api-access-kz5w8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-rk74b\" (UID: \"698e6b0e-c4d5-4d1f-a178-962254f68e1c\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:43:54.882907 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:54.882868 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:43:55.030912 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:55.030886 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"]
Apr 22 17:43:55.034018 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:43:55.033982 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698e6b0e_c4d5_4d1f_a178_962254f68e1c.slice/crio-221c1a500eb336e09421d7676399b1941e64fe9ac481db64c7dbceac2b95a915 WatchSource:0}: Error finding container 221c1a500eb336e09421d7676399b1941e64fe9ac481db64c7dbceac2b95a915: Status 404 returned error can't find the container with id 221c1a500eb336e09421d7676399b1941e64fe9ac481db64c7dbceac2b95a915
Apr 22 17:43:55.205242 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:55.205211 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b" event={"ID":"698e6b0e-c4d5-4d1f-a178-962254f68e1c","Type":"ContainerStarted","Data":"221c1a500eb336e09421d7676399b1941e64fe9ac481db64c7dbceac2b95a915"}
Apr 22 17:43:57.214357 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:57.214320 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng" event={"ID":"dc1b9d03-8eea-42c8-bf4c-980377435abd","Type":"ContainerStarted","Data":"9afe4ef1585e8f43801e0430af1411c45d7e7ef638908463c0a48acd44575ed0"}
Apr 22 17:43:57.214830 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:57.214403 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:43:57.215774 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:57.215749 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b" event={"ID":"698e6b0e-c4d5-4d1f-a178-962254f68e1c","Type":"ContainerStarted","Data":"ba6ef2bdfed4e200561e5ea12e596a2b7422aca0ca8372f3a36f4b6bc7b74a22"}
Apr 22 17:43:57.215896 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:57.215880 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:43:57.233497 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:57.233439 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng" podStartSLOduration=1.846508051 podStartE2EDuration="5.233419784s" podCreationTimestamp="2026-04-22 17:43:52 +0000 UTC" firstStartedPulling="2026-04-22 17:43:52.815320257 +0000 UTC m=+588.796966184" lastFinishedPulling="2026-04-22 17:43:56.202231977 +0000 UTC m=+592.183877917" observedRunningTime="2026-04-22 17:43:57.232583686 +0000 UTC m=+593.214229636" watchObservedRunningTime="2026-04-22 17:43:57.233419784 +0000 UTC m=+593.215065732"
Apr 22 17:43:57.256302 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:43:57.256247 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b" podStartSLOduration=1.244248461 podStartE2EDuration="3.256231037s" podCreationTimestamp="2026-04-22 17:43:54 +0000 UTC" firstStartedPulling="2026-04-22 17:43:55.036481053 +0000 UTC m=+591.018126981" lastFinishedPulling="2026-04-22 17:43:57.048463619 +0000 UTC m=+593.030109557" observedRunningTime="2026-04-22 17:43:57.255265925 +0000 UTC m=+593.236911896" watchObservedRunningTime="2026-04-22 17:43:57.256231037 +0000 UTC m=+593.237877034"
Apr 22 17:44:00.386089 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.386056 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"]
Apr 22 17:44:00.389283 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.389264 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.391614 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.391595 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-msjkn\""
Apr 22 17:44:00.401123 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.401096 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"]
Apr 22 17:44:00.559628 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.559593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c78527fa-b7db-4dd8-909a-b848cca5ac08-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-7cv4v\" (UID: \"c78527fa-b7db-4dd8-909a-b848cca5ac08\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.559809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.559654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fv6\" (UniqueName: \"kubernetes.io/projected/c78527fa-b7db-4dd8-909a-b848cca5ac08-kube-api-access-w4fv6\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-7cv4v\" (UID: \"c78527fa-b7db-4dd8-909a-b848cca5ac08\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.660833 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.660736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c78527fa-b7db-4dd8-909a-b848cca5ac08-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-7cv4v\" (UID: \"c78527fa-b7db-4dd8-909a-b848cca5ac08\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.661020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.660840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fv6\" (UniqueName: \"kubernetes.io/projected/c78527fa-b7db-4dd8-909a-b848cca5ac08-kube-api-access-w4fv6\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-7cv4v\" (UID: \"c78527fa-b7db-4dd8-909a-b848cca5ac08\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.661311 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.661289 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c78527fa-b7db-4dd8-909a-b848cca5ac08-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-7cv4v\" (UID: \"c78527fa-b7db-4dd8-909a-b848cca5ac08\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.670547 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.670518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fv6\" (UniqueName: \"kubernetes.io/projected/c78527fa-b7db-4dd8-909a-b848cca5ac08-kube-api-access-w4fv6\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-7cv4v\" (UID: \"c78527fa-b7db-4dd8-909a-b848cca5ac08\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.699268 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.699232 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:00.817701 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:00.817638 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"]
Apr 22 17:44:00.820555 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:44:00.820525 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc78527fa_b7db_4dd8_909a_b848cca5ac08.slice/crio-33b4c73bb518e834005fc3747d9ee00d5a036eb30ea2c23a22b106acdd25f93e WatchSource:0}: Error finding container 33b4c73bb518e834005fc3747d9ee00d5a036eb30ea2c23a22b106acdd25f93e: Status 404 returned error can't find the container with id 33b4c73bb518e834005fc3747d9ee00d5a036eb30ea2c23a22b106acdd25f93e
Apr 22 17:44:01.229911 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:01.229875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v" event={"ID":"c78527fa-b7db-4dd8-909a-b848cca5ac08","Type":"ContainerStarted","Data":"33b4c73bb518e834005fc3747d9ee00d5a036eb30ea2c23a22b106acdd25f93e"}
Apr 22 17:44:04.515580 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:04.515547 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 17:44:04.515986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:04.515606 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 17:44:05.245334 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:05.245294 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v" event={"ID":"c78527fa-b7db-4dd8-909a-b848cca5ac08","Type":"ContainerStarted","Data":"d239c6dbcc0ac14eecad6fe0e4fc92e8d4f0d6ddf9e1c0c51d78de6b51b31a86"}
Apr 22 17:44:05.245484 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:05.245426 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:44:05.264630 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:05.264569 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v" podStartSLOduration=1.519013559 podStartE2EDuration="5.264553064s" podCreationTimestamp="2026-04-22 17:44:00 +0000 UTC" firstStartedPulling="2026-04-22 17:44:00.823048396 +0000 UTC m=+596.804694325" lastFinishedPulling="2026-04-22 17:44:04.568587903 +0000 UTC m=+600.550233830" observedRunningTime="2026-04-22 17:44:05.263488001 +0000 UTC m=+601.245133951" watchObservedRunningTime="2026-04-22 17:44:05.264553064 +0000 UTC m=+601.246199013"
Apr 22 17:44:08.221890 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:08.221857 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-dmcng"
Apr 22 17:44:08.222293 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:08.222029 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-rk74b"
Apr 22 17:44:16.251482 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:44:16.251447 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-7cv4v"
Apr 22 17:45:28.303172 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.303137 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"]
Apr 22 17:45:28.306316 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.306297 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.315813 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.315791 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"]
Apr 22 17:45:28.338292 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.338430 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338315 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.338430 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338348 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24btn\" (UniqueName: \"kubernetes.io/projected/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-kube-api-access-24btn\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.338430 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.338550 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.338550 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338528 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.338624 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.338554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"
Apr 22 17:45:28.438897 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.438866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID:
\"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439089 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.438908 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439089 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.438927 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24btn\" (UniqueName: \"kubernetes.io/projected/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-kube-api-access-24btn\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439089 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.438988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439261 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.439123 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439261 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.439168 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439261 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.439191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.439929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.439902 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.441382 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.441350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.441488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.441383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: 
\"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.441488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.441403 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.441488 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.441410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.447254 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.447234 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24btn\" (UniqueName: \"kubernetes.io/projected/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-kube-api-access-24btn\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.447363 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.447341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af5eb2e4-1b98-43d1-98e5-57c92f8736ee-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vtxh4\" (UID: \"af5eb2e4-1b98-43d1-98e5-57c92f8736ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.615153 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.615126 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:28.740027 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.739910 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4"] Apr 22 17:45:28.742669 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:45:28.742635 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf5eb2e4_1b98_43d1_98e5_57c92f8736ee.slice/crio-e71a94530734f53f2b6c1f0343ce31a916f4a299a5207c96c9d591f9817b4b8a WatchSource:0}: Error finding container e71a94530734f53f2b6c1f0343ce31a916f4a299a5207c96c9d591f9817b4b8a: Status 404 returned error can't find the container with id e71a94530734f53f2b6c1f0343ce31a916f4a299a5207c96c9d591f9817b4b8a Apr 22 17:45:28.744725 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.744697 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:45:28.744806 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:28.744766 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:45:29.511715 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.511681 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" event={"ID":"af5eb2e4-1b98-43d1-98e5-57c92f8736ee","Type":"ContainerStarted","Data":"4b255ab743ab38d3232c357c3e343e692d1d2e47293463139a99eb7faa393453"} Apr 22 17:45:29.511715 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.511717 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" 
event={"ID":"af5eb2e4-1b98-43d1-98e5-57c92f8736ee","Type":"ContainerStarted","Data":"e71a94530734f53f2b6c1f0343ce31a916f4a299a5207c96c9d591f9817b4b8a"} Apr 22 17:45:29.512175 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.512010 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:29.513512 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.513488 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" Apr 22 17:45:29.542024 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.541972 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vtxh4" podStartSLOduration=1.541956705 podStartE2EDuration="1.541956705s" podCreationTimestamp="2026-04-22 17:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:45:29.536301735 +0000 UTC m=+685.517947688" watchObservedRunningTime="2026-04-22 17:45:29.541956705 +0000 UTC m=+685.523602653" Apr 22 17:45:29.603175 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.603135 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"] Apr 22 17:45:29.603517 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.603489 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerName="discovery" containerID="cri-o://5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557" gracePeriod=30 Apr 22 17:45:29.846071 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.846050 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" Apr 22 17:45:29.850848 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.850829 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-kubeconfig\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.850907 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.850863 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtvt8\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-kube-api-access-vtvt8\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.850907 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.850880 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-ca-configmap\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.851005 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.850909 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-dns-cert\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.851005 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.850958 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-cacerts\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.851005 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.851000 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-token\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.851142 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.851046 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af3aa698-d3a4-4754-8b44-26b02e966f76-local-certs\") pod \"af3aa698-d3a4-4754-8b44-26b02e966f76\" (UID: \"af3aa698-d3a4-4754-8b44-26b02e966f76\") " Apr 22 17:45:29.851307 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.851264 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:45:29.853286 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.853260 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3aa698-d3a4-4754-8b44-26b02e966f76-local-certs" (OuterVolumeSpecName: "local-certs") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "local-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:45:29.853286 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.853267 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-kube-api-access-vtvt8" (OuterVolumeSpecName: "kube-api-access-vtvt8") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "kube-api-access-vtvt8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:45:29.853530 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.853498 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-token" (OuterVolumeSpecName: "istio-token") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:45:29.853530 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.853521 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:45:29.853694 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.853557 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "istio-csr-dns-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:45:29.853694 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.853571 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-cacerts" (OuterVolumeSpecName: "cacerts") pod "af3aa698-d3a4-4754-8b44-26b02e966f76" (UID: "af3aa698-d3a4-4754-8b44-26b02e966f76"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:45:29.951664 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951631 2568 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-kubeconfig\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:29.951664 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951661 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtvt8\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-kube-api-access-vtvt8\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:29.951664 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951673 2568 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-ca-configmap\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:29.951867 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951682 2568 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-csr-dns-cert\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:29.951867 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951691 2568 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/af3aa698-d3a4-4754-8b44-26b02e966f76-cacerts\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:29.951867 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951700 2568 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/af3aa698-d3a4-4754-8b44-26b02e966f76-istio-token\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:29.951867 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:29.951709 2568 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/af3aa698-d3a4-4754-8b44-26b02e966f76-local-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:45:30.516589 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.516555 2568 generic.go:358] "Generic (PLEG): container finished" podID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerID="5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557" exitCode=0 Apr 22 17:45:30.517094 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.516618 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" Apr 22 17:45:30.517094 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.516646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" event={"ID":"af3aa698-d3a4-4754-8b44-26b02e966f76","Type":"ContainerDied","Data":"5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557"} Apr 22 17:45:30.517094 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.516695 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g" event={"ID":"af3aa698-d3a4-4754-8b44-26b02e966f76","Type":"ContainerDied","Data":"75b94b189f5e3f474028a6017d893be2012bac455234a3ec6b70d051cae38367"} Apr 22 17:45:30.517094 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.516718 2568 scope.go:117] "RemoveContainer" containerID="5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557" Apr 22 17:45:30.524616 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.524600 2568 scope.go:117] "RemoveContainer" containerID="5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557" Apr 22 17:45:30.524829 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:45:30.524812 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557\": container with ID starting with 5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557 not found: ID does not exist" containerID="5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557" Apr 22 17:45:30.524888 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.524834 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557"} err="failed to get container status 
\"5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557\": rpc error: code = NotFound desc = could not find container \"5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557\": container with ID starting with 5648af428704206035643458bf331260aec76d934a842d61d91975825f3fc557 not found: ID does not exist" Apr 22 17:45:30.539053 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.539030 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"] Apr 22 17:45:30.543507 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:30.543489 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gn82g"] Apr 22 17:45:32.532831 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:32.532784 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" path="/var/lib/kubelet/pods/af3aa698-d3a4-4754-8b44-26b02e966f76/volumes" Apr 22 17:45:36.580480 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.580449 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-tg77x"] Apr 22 17:45:36.580903 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.580712 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerName="discovery" Apr 22 17:45:36.580903 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.580723 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerName="discovery" Apr 22 17:45:36.580903 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.580773 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="af3aa698-d3a4-4754-8b44-26b02e966f76" containerName="discovery" Apr 22 17:45:36.584903 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.584883 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.587256 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.587210 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 17:45:36.587390 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.587217 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 17:45:36.588086 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.588070 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 17:45:36.588219 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.588131 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-27gmk\"" Apr 22 17:45:36.594557 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.594536 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-tg77x"] Apr 22 17:45:36.596643 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.596620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d8e023c5-7fd2-487c-a12d-6fb177d643f6-data\") pod \"seaweedfs-86cc847c5c-tg77x\" (UID: \"d8e023c5-7fd2-487c-a12d-6fb177d643f6\") " pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.596741 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.596675 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zx5\" (UniqueName: \"kubernetes.io/projected/d8e023c5-7fd2-487c-a12d-6fb177d643f6-kube-api-access-r7zx5\") pod \"seaweedfs-86cc847c5c-tg77x\" (UID: \"d8e023c5-7fd2-487c-a12d-6fb177d643f6\") " pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.697303 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.697268 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r7zx5\" (UniqueName: \"kubernetes.io/projected/d8e023c5-7fd2-487c-a12d-6fb177d643f6-kube-api-access-r7zx5\") pod \"seaweedfs-86cc847c5c-tg77x\" (UID: \"d8e023c5-7fd2-487c-a12d-6fb177d643f6\") " pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.697470 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.697326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d8e023c5-7fd2-487c-a12d-6fb177d643f6-data\") pod \"seaweedfs-86cc847c5c-tg77x\" (UID: \"d8e023c5-7fd2-487c-a12d-6fb177d643f6\") " pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.697659 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.697644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/d8e023c5-7fd2-487c-a12d-6fb177d643f6-data\") pod \"seaweedfs-86cc847c5c-tg77x\" (UID: \"d8e023c5-7fd2-487c-a12d-6fb177d643f6\") " pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.705223 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.705195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zx5\" (UniqueName: \"kubernetes.io/projected/d8e023c5-7fd2-487c-a12d-6fb177d643f6-kube-api-access-r7zx5\") pod \"seaweedfs-86cc847c5c-tg77x\" (UID: \"d8e023c5-7fd2-487c-a12d-6fb177d643f6\") " pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:36.894771 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:36.894678 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:37.014897 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:37.014865 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-tg77x"] Apr 22 17:45:37.017796 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:45:37.017767 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e023c5_7fd2_487c_a12d_6fb177d643f6.slice/crio-32fd9d2ad803bce783a33c25b964694c30fe1e6ddb27a4f45a2e3ba04271874f WatchSource:0}: Error finding container 32fd9d2ad803bce783a33c25b964694c30fe1e6ddb27a4f45a2e3ba04271874f: Status 404 returned error can't find the container with id 32fd9d2ad803bce783a33c25b964694c30fe1e6ddb27a4f45a2e3ba04271874f Apr 22 17:45:37.541775 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:37.541729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-tg77x" event={"ID":"d8e023c5-7fd2-487c-a12d-6fb177d643f6","Type":"ContainerStarted","Data":"32fd9d2ad803bce783a33c25b964694c30fe1e6ddb27a4f45a2e3ba04271874f"} Apr 22 17:45:40.552234 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:40.552196 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-tg77x" event={"ID":"d8e023c5-7fd2-487c-a12d-6fb177d643f6","Type":"ContainerStarted","Data":"654d6a30ba5a56a8cee50d1cf31eebae3f3875ab42831f24afb73dbffcb3c5d8"} Apr 22 17:45:40.552639 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:40.552308 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:45:40.567870 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:40.567820 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-tg77x" podStartSLOduration=2.050351713 podStartE2EDuration="4.567806172s" podCreationTimestamp="2026-04-22 17:45:36 +0000 UTC" 
firstStartedPulling="2026-04-22 17:45:37.019064911 +0000 UTC m=+693.000710838" lastFinishedPulling="2026-04-22 17:45:39.536519356 +0000 UTC m=+695.518165297" observedRunningTime="2026-04-22 17:45:40.566790029 +0000 UTC m=+696.548435975" watchObservedRunningTime="2026-04-22 17:45:40.567806172 +0000 UTC m=+696.549452121" Apr 22 17:45:46.558653 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:45:46.558624 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-tg77x" Apr 22 17:46:47.129857 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.129819 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-8g9bd"] Apr 22 17:46:47.133114 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.133092 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.136391 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.136371 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 17:46:47.136515 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.136409 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-2fhbk\"" Apr 22 17:46:47.142608 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.142587 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8g9bd"] Apr 22 17:46:47.144308 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.144291 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-cert\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.144408 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:46:47.144346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqngg\" (UniqueName: \"kubernetes.io/projected/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-kube-api-access-pqngg\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.245196 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.245163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqngg\" (UniqueName: \"kubernetes.io/projected/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-kube-api-access-pqngg\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.245340 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.245210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-cert\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.245340 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:46:47.245310 2568 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 17:46:47.245411 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:46:47.245382 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-cert podName:ef2d974f-7c57-4dd1-8c0c-45d19af7904c nodeName:}" failed. No retries permitted until 2026-04-22 17:46:47.745361825 +0000 UTC m=+763.727007768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-cert") pod "odh-model-controller-696fc77849-8g9bd" (UID: "ef2d974f-7c57-4dd1-8c0c-45d19af7904c") : secret "odh-model-controller-webhook-cert" not found Apr 22 17:46:47.257102 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.257078 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqngg\" (UniqueName: \"kubernetes.io/projected/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-kube-api-access-pqngg\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.749606 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.749574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-cert\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:47.752014 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:47.751993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef2d974f-7c57-4dd1-8c0c-45d19af7904c-cert\") pod \"odh-model-controller-696fc77849-8g9bd\" (UID: \"ef2d974f-7c57-4dd1-8c0c-45d19af7904c\") " pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:48.044393 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:48.044307 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:48.163165 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:48.163135 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8g9bd"] Apr 22 17:46:48.166031 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:46:48.166004 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef2d974f_7c57_4dd1_8c0c_45d19af7904c.slice/crio-f5f2495ba4fded32de1e71ae5eb5585492d413d79cf1802256ddf37c479a44ab WatchSource:0}: Error finding container f5f2495ba4fded32de1e71ae5eb5585492d413d79cf1802256ddf37c479a44ab: Status 404 returned error can't find the container with id f5f2495ba4fded32de1e71ae5eb5585492d413d79cf1802256ddf37c479a44ab Apr 22 17:46:48.762987 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:48.762949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8g9bd" event={"ID":"ef2d974f-7c57-4dd1-8c0c-45d19af7904c","Type":"ContainerStarted","Data":"f5f2495ba4fded32de1e71ae5eb5585492d413d79cf1802256ddf37c479a44ab"} Apr 22 17:46:51.777006 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:51.776963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8g9bd" event={"ID":"ef2d974f-7c57-4dd1-8c0c-45d19af7904c","Type":"ContainerStarted","Data":"256d18b1cf5646a81e41a5afdf7fb1e3acb744349128e564d513ed91b783646d"} Apr 22 17:46:51.777403 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:51.777031 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:46:51.792753 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:46:51.792650 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-8g9bd" podStartSLOduration=2.099643827 podStartE2EDuration="4.79263273s" 
podCreationTimestamp="2026-04-22 17:46:47 +0000 UTC" firstStartedPulling="2026-04-22 17:46:48.16731326 +0000 UTC m=+764.148959186" lastFinishedPulling="2026-04-22 17:46:50.860302154 +0000 UTC m=+766.841948089" observedRunningTime="2026-04-22 17:46:51.79209741 +0000 UTC m=+767.773743361" watchObservedRunningTime="2026-04-22 17:46:51.79263273 +0000 UTC m=+767.774278680" Apr 22 17:47:02.782172 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:02.782143 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-8g9bd" Apr 22 17:47:23.918379 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.918298 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2"] Apr 22 17:47:23.922121 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.922091 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:23.924710 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.924686 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 17:47:23.924832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.924812 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 17:47:23.925127 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.925109 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-gnmws\"" Apr 22 17:47:23.925234 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.925108 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 22 17:47:23.932463 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:23.932443 2568 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2"] Apr 22 17:47:24.033693 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033656 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6vn6\" (UniqueName: \"kubernetes.io/projected/c7faed16-9523-4c8f-b29d-6c0555986b88-kube-api-access-z6vn6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033693 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033717 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033741 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c7faed16-9523-4c8f-b29d-6c0555986b88-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033776 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033800 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033883 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.033929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" 
Apr 22 17:47:24.034248 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.033984 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135310 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135496 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135496 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135602 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135513 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135602 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6vn6\" (UniqueName: \"kubernetes.io/projected/c7faed16-9523-4c8f-b29d-6c0555986b88-kube-api-access-z6vn6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135602 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135589 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135749 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135749 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135749 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135892 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135892 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135793 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c7faed16-9523-4c8f-b29d-6c0555986b88-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.135892 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: 
\"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.136055 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.135876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.136411 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.136391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c7faed16-9523-4c8f-b29d-6c0555986b88-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.138104 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.138085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.138354 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.138334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.145062 ip-10-0-133-169 kubenswrapper[2568]: 
I0422 17:47:24.145026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c7faed16-9523-4c8f-b29d-6c0555986b88-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.145350 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.145333 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6vn6\" (UniqueName: \"kubernetes.io/projected/c7faed16-9523-4c8f-b29d-6c0555986b88-kube-api-access-z6vn6\") pod \"router-gateway-1-openshift-default-6c59fbf55c-q8cf2\" (UID: \"c7faed16-9523-4c8f-b29d-6c0555986b88\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.233805 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.233735 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:24.364255 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.364230 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2"] Apr 22 17:47:24.366991 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:47:24.366963 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7faed16_9523_4c8f_b29d_6c0555986b88.slice/crio-3a3790e6581d8854578a95077643322e6109af86fdab9c29772c1efcf1335c91 WatchSource:0}: Error finding container 3a3790e6581d8854578a95077643322e6109af86fdab9c29772c1efcf1335c91: Status 404 returned error can't find the container with id 3a3790e6581d8854578a95077643322e6109af86fdab9c29772c1efcf1335c91 Apr 22 17:47:24.369150 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.369131 2568 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 22 17:47:24.887768 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:24.887731 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" event={"ID":"c7faed16-9523-4c8f-b29d-6c0555986b88","Type":"ContainerStarted","Data":"3a3790e6581d8854578a95077643322e6109af86fdab9c29772c1efcf1335c91"} Apr 22 17:47:26.906098 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:26.906057 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:47:26.906364 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:26.906131 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:47:26.906364 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:26.906164 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:47:27.902075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:27.902042 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" event={"ID":"c7faed16-9523-4c8f-b29d-6c0555986b88","Type":"ContainerStarted","Data":"c589a3268e42e698abc0c849bb091f9b80a4449bcd8d81a76c2472f9287eb406"} Apr 22 17:47:27.925502 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:27.925460 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" podStartSLOduration=2.388881981 podStartE2EDuration="4.925445959s" podCreationTimestamp="2026-04-22 17:47:23 +0000 UTC" firstStartedPulling="2026-04-22 17:47:24.369259471 +0000 UTC 
m=+800.350905398" lastFinishedPulling="2026-04-22 17:47:26.905823449 +0000 UTC m=+802.887469376" observedRunningTime="2026-04-22 17:47:27.924306562 +0000 UTC m=+803.905952511" watchObservedRunningTime="2026-04-22 17:47:27.925445959 +0000 UTC m=+803.907091908" Apr 22 17:47:28.234959 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:28.234843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:28.239596 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:28.239576 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:28.905459 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:28.905425 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:28.906587 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:28.906566 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-q8cf2" Apr 22 17:47:37.334851 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.334818 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp"] Apr 22 17:47:37.338614 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.338586 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.341255 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.341234 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:47:37.341377 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.341266 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 17:47:37.341874 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.341857 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-x7q7j\"" Apr 22 17:47:37.349859 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.349835 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp"] Apr 22 17:47:37.446809 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.446774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.447021 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.446831 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.447021 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.446909 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5bt\" (UniqueName: \"kubernetes.io/projected/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kube-api-access-pj5bt\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.447021 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.446998 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.447172 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.447047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.447172 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.447086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" 
(UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.547871 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.547830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548110 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.547889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548110 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.547931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548110 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: 
\"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548110 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548110 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5bt\" (UniqueName: \"kubernetes.io/projected/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kube-api-access-pj5bt\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548385 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548351 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548444 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548378 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: 
\"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548519 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.548658 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.548642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.550598 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.550575 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.556911 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.556883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5bt\" (UniqueName: \"kubernetes.io/projected/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kube-api-access-pj5bt\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.648733 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.648643 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:47:37.776996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.776971 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp"] Apr 22 17:47:37.779382 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:47:37.779345 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e4b039_fd60_4bcc_9215_dc4a701a92f9.slice/crio-cdd14d81a845fb495cc629508a61ba29f343d1724529428820efb6704fcebba7 WatchSource:0}: Error finding container cdd14d81a845fb495cc629508a61ba29f343d1724529428820efb6704fcebba7: Status 404 returned error can't find the container with id cdd14d81a845fb495cc629508a61ba29f343d1724529428820efb6704fcebba7 Apr 22 17:47:37.941683 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:37.941592 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerStarted","Data":"cdd14d81a845fb495cc629508a61ba29f343d1724529428820efb6704fcebba7"} Apr 22 17:47:41.957986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:41.957934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerStarted","Data":"a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0"} Apr 22 17:47:42.962533 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:42.962442 2568 generic.go:358] "Generic (PLEG): 
container finished" podID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerID="a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0" exitCode=0 Apr 22 17:47:42.962877 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:42.962527 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerDied","Data":"a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0"} Apr 22 17:47:44.972646 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:47:44.972608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerStarted","Data":"2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0"} Apr 22 17:48:15.089990 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:15.089952 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerStarted","Data":"00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650"} Apr 22 17:48:15.090401 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:15.090181 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:15.092989 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:15.092969 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:15.119889 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:15.119832 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" podStartSLOduration=1.30130375 podStartE2EDuration="38.119816894s" podCreationTimestamp="2026-04-22 17:47:37 +0000 UTC" firstStartedPulling="2026-04-22 17:47:37.781161572 +0000 UTC m=+813.762807503" lastFinishedPulling="2026-04-22 17:48:14.599674706 +0000 UTC m=+850.581320647" observedRunningTime="2026-04-22 17:48:15.118920075 +0000 UTC m=+851.100566033" watchObservedRunningTime="2026-04-22 17:48:15.119816894 +0000 UTC m=+851.101462844" Apr 22 17:48:17.649688 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:17.649651 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:17.650128 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:17.649704 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:27.651405 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:27.651375 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:27.652667 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:27.652643 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:28.793514 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:28.793478 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp"] Apr 22 17:48:29.135116 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:29.135021 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" 
podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="main" containerID="cri-o://2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0" gracePeriod=30 Apr 22 17:48:29.135116 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:29.135039 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="tokenizer" containerID="cri-o://00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650" gracePeriod=30 Apr 22 17:48:30.141216 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.141165 2568 generic.go:358] "Generic (PLEG): container finished" podID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerID="2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0" exitCode=0 Apr 22 17:48:30.141591 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.141240 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerDied","Data":"2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0"} Apr 22 17:48:30.481480 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.481456 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:30.616848 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.616810 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-cache\") pod \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " Apr 22 17:48:30.617060 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.616861 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tls-certs\") pod \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " Apr 22 17:48:30.617060 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.616900 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-uds\") pod \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " Apr 22 17:48:30.617060 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.616925 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj5bt\" (UniqueName: \"kubernetes.io/projected/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kube-api-access-pj5bt\") pod \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " Apr 22 17:48:30.617060 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617030 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kserve-provision-location\") pod \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") 
" Apr 22 17:48:30.617279 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617090 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-tmp\") pod \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\" (UID: \"38e4b039-fd60-4bcc-9215-dc4a701a92f9\") " Apr 22 17:48:30.617279 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617152 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "38e4b039-fd60-4bcc-9215-dc4a701a92f9" (UID: "38e4b039-fd60-4bcc-9215-dc4a701a92f9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:30.617279 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617212 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "38e4b039-fd60-4bcc-9215-dc4a701a92f9" (UID: "38e4b039-fd60-4bcc-9215-dc4a701a92f9"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:30.617465 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617439 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:48:30.617465 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617469 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:48:30.617644 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617506 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "38e4b039-fd60-4bcc-9215-dc4a701a92f9" (UID: "38e4b039-fd60-4bcc-9215-dc4a701a92f9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:30.617798 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.617777 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "38e4b039-fd60-4bcc-9215-dc4a701a92f9" (UID: "38e4b039-fd60-4bcc-9215-dc4a701a92f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:48:30.619172 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.619157 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "38e4b039-fd60-4bcc-9215-dc4a701a92f9" (UID: "38e4b039-fd60-4bcc-9215-dc4a701a92f9"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:48:30.619309 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.619290 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kube-api-access-pj5bt" (OuterVolumeSpecName: "kube-api-access-pj5bt") pod "38e4b039-fd60-4bcc-9215-dc4a701a92f9" (UID: "38e4b039-fd60-4bcc-9215-dc4a701a92f9"). InnerVolumeSpecName "kube-api-access-pj5bt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:48:30.718048 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.717958 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:48:30.718048 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.717987 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:48:30.718048 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.717998 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38e4b039-fd60-4bcc-9215-dc4a701a92f9-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:48:30.718048 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:30.718006 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pj5bt\" (UniqueName: \"kubernetes.io/projected/38e4b039-fd60-4bcc-9215-dc4a701a92f9-kube-api-access-pj5bt\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:48:31.147360 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.147317 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerID="00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650" exitCode=0 Apr 22 17:48:31.147800 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.147374 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerDied","Data":"00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650"} Apr 22 17:48:31.147800 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.147408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" event={"ID":"38e4b039-fd60-4bcc-9215-dc4a701a92f9","Type":"ContainerDied","Data":"cdd14d81a845fb495cc629508a61ba29f343d1724529428820efb6704fcebba7"} Apr 22 17:48:31.147800 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.147428 2568 scope.go:117] "RemoveContainer" containerID="00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650" Apr 22 17:48:31.147800 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.147428 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp" Apr 22 17:48:31.155643 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.155623 2568 scope.go:117] "RemoveContainer" containerID="2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0" Apr 22 17:48:31.162473 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.162456 2568 scope.go:117] "RemoveContainer" containerID="a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0" Apr 22 17:48:31.169060 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.169041 2568 scope.go:117] "RemoveContainer" containerID="00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650" Apr 22 17:48:31.169289 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:48:31.169268 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650\": container with ID starting with 00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650 not found: ID does not exist" containerID="00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650" Apr 22 17:48:31.169370 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.169303 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650"} err="failed to get container status \"00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650\": rpc error: code = NotFound desc = could not find container \"00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650\": container with ID starting with 00c8b4dee5567c7e16955ebcce2188c72d3a8a8f3eb05a112c8763e334f59650 not found: ID does not exist" Apr 22 17:48:31.169370 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.169328 2568 scope.go:117] "RemoveContainer" 
containerID="2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0" Apr 22 17:48:31.169586 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:48:31.169568 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0\": container with ID starting with 2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0 not found: ID does not exist" containerID="2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0" Apr 22 17:48:31.169639 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.169591 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0"} err="failed to get container status \"2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0\": rpc error: code = NotFound desc = could not find container \"2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0\": container with ID starting with 2a6f4812e341a649ec4ab9265f7fe8bf0e8c48ea0c275a17fad3ccf0a87062c0 not found: ID does not exist" Apr 22 17:48:31.169639 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.169608 2568 scope.go:117] "RemoveContainer" containerID="a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0" Apr 22 17:48:31.169848 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:48:31.169826 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0\": container with ID starting with a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0 not found: ID does not exist" containerID="a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0" Apr 22 17:48:31.169916 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.169856 2568 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0"} err="failed to get container status \"a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0\": rpc error: code = NotFound desc = could not find container \"a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0\": container with ID starting with a74330872ab748f526bac5225319ca4deacd0f015490e3b090651cd0c25746a0 not found: ID does not exist" Apr 22 17:48:31.172185 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.172164 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp"] Apr 22 17:48:31.177568 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:31.177547 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6f467dm4gp"] Apr 22 17:48:32.532843 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:32.532814 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" path="/var/lib/kubelet/pods/38e4b039-fd60-4bcc-9215-dc4a701a92f9/volumes" Apr 22 17:48:36.002961 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.002911 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"] Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003222 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="main" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003234 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="main" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003242 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="storage-initializer" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003250 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="storage-initializer" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003259 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="tokenizer" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003264 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="tokenizer" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003308 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="tokenizer" Apr 22 17:48:36.003346 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.003318 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38e4b039-fd60-4bcc-9215-dc4a701a92f9" containerName="main" Apr 22 17:48:36.008177 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.008151 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" Apr 22 17:48:36.011059 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.011035 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 17:48:36.011200 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.011044 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:48:36.011596 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.011577 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-dlhwp\"" Apr 22 17:48:36.020536 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.018646 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"] Apr 22 17:48:36.156854 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.156812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" Apr 22 17:48:36.157053 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.156865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8wm\" (UniqueName: \"kubernetes.io/projected/fcf65753-3087-4ead-a95d-c6e58ffa241b-kube-api-access-cd8wm\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" Apr 22 17:48:36.157053 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.156978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf65753-3087-4ead-a95d-c6e58ffa241b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" Apr 22 17:48:36.157053 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.157012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" Apr 22 17:48:36.157053 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.157030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" Apr 22 17:48:36.157438 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.157108 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: 
\"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.257700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.257611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.257700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.257660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd8wm\" (UniqueName: \"kubernetes.io/projected/fcf65753-3087-4ead-a95d-c6e58ffa241b-kube-api-access-cd8wm\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.257700 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.257701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf65753-3087-4ead-a95d-c6e58ffa241b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.257925 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.257721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.257925 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.257738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.257925 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.257789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.258134 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.258112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.258168 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.258127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.258231 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.258204 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.258268 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.258211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.260363 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.260340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf65753-3087-4ead-a95d-c6e58ffa241b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.274014 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.273978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd8wm\" (UniqueName: \"kubernetes.io/projected/fcf65753-3087-4ead-a95d-c6e58ffa241b-kube-api-access-cd8wm\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.326690 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.326653 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:36.452990 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:36.452958 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"]
Apr 22 17:48:36.456856 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:48:36.456827 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf65753_3087_4ead_a95d_c6e58ffa241b.slice/crio-c9aa93df13fdcc8cb015ccb9450b8157a8fe6be7b711a201a2b885f2e995b1aa WatchSource:0}: Error finding container c9aa93df13fdcc8cb015ccb9450b8157a8fe6be7b711a201a2b885f2e995b1aa: Status 404 returned error can't find the container with id c9aa93df13fdcc8cb015ccb9450b8157a8fe6be7b711a201a2b885f2e995b1aa
Apr 22 17:48:37.169375 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:37.169339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerStarted","Data":"7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404"}
Apr 22 17:48:37.169375 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:37.169379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerStarted","Data":"c9aa93df13fdcc8cb015ccb9450b8157a8fe6be7b711a201a2b885f2e995b1aa"}
Apr 22 17:48:38.173517 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:38.173477 2568 generic.go:358] "Generic (PLEG): container finished" podID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerID="7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404" exitCode=0
Apr 22 17:48:38.173914 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:38.173558 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerDied","Data":"7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404"}
Apr 22 17:48:39.178686 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:39.178647 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerStarted","Data":"8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1"}
Apr 22 17:48:39.178686 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:39.178684 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerStarted","Data":"208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a"}
Apr 22 17:48:39.179217 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:39.178763 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:39.203210 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:39.203168 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" podStartSLOduration=4.203153132 podStartE2EDuration="4.203153132s" podCreationTimestamp="2026-04-22 17:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:48:39.200311061 +0000 UTC m=+875.181957015" watchObservedRunningTime="2026-04-22 17:48:39.203153132 +0000 UTC m=+875.184799080"
Apr 22 17:48:46.327301 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:46.327266 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:46.327883 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:46.327423 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:46.329899 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:46.329872 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:48:47.205297 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:48:47.205266 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:49:04.541256 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:04.541225 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 17:49:04.541649 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:04.541291 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 17:49:09.211591 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:09.211563 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:49:22.127312 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.127281 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"]
Apr 22 17:49:22.131795 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.131777 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.134958 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.134905 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 17:49:22.152849 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.152817 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"]
Apr 22 17:49:22.240078 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.240040 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-dshm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.240078 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.240083 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-model-cache\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.240295 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.240106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-home\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.240295 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.240214 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.240295 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.240256 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvppd\" (UniqueName: \"kubernetes.io/projected/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kube-api-access-bvppd\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.240295 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.240282 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-tls-certs\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341052 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341052 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvppd\" (UniqueName: \"kubernetes.io/projected/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kube-api-access-bvppd\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341267 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-tls-certs\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341308 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341293 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-dshm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341349 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-model-cache\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341394 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341360 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-home\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341502 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341658 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-model-cache\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.341687 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.341668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-home\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.343474 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.343430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-dshm\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.343760 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.343737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-tls-certs\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.353207 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.353184 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvppd\" (UniqueName: \"kubernetes.io/projected/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kube-api-access-bvppd\") pod \"precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.398591 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.398494 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"]
Apr 22 17:49:22.402524 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.402504 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.406248 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.406223 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-v5gsr\""
Apr 22 17:49:22.420460 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.420431 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"]
Apr 22 17:49:22.442654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.442617 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"
Apr 22 17:49:22.543897 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.543858 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.544096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.543902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.544096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.543996 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07c381c8-3f2e-4287-a861-45fd3317b676-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.544096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.544020 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.544096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.544064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.544271 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.544120 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzl9m\" (UniqueName: \"kubernetes.io/projected/07c381c8-3f2e-4287-a861-45fd3317b676-kube-api-access-pzl9m\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.584690 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.584658 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"]
Apr 22 17:49:22.588164 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:49:22.588137 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f91946_bf44_4bb1_9ce5_dc4ef78686e2.slice/crio-eecc6262e12e0cba32daf56b47fdee007e43a934a854ca5e1779669a1f9c351c WatchSource:0}: Error finding container eecc6262e12e0cba32daf56b47fdee007e43a934a854ca5e1779669a1f9c351c: Status 404 returned error can't find the container with id eecc6262e12e0cba32daf56b47fdee007e43a934a854ca5e1779669a1f9c351c
Apr 22 17:49:22.645352 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645489 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzl9m\" (UniqueName: \"kubernetes.io/projected/07c381c8-3f2e-4287-a861-45fd3317b676-kube-api-access-pzl9m\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645489 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645398 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645489 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645660 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645518 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07c381c8-3f2e-4287-a861-45fd3317b676-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645660 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645824 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.645824 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.645817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.646029 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.646007 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.646147 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.646124 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.648055 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.648035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07c381c8-3f2e-4287-a861-45fd3317b676-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.661078 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.660994 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzl9m\" (UniqueName: \"kubernetes.io/projected/07c381c8-3f2e-4287-a861-45fd3317b676-kube-api-access-pzl9m\") pod \"precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.712060 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.712017 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:22.854085 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:22.854019 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"]
Apr 22 17:49:22.857642 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:49:22.857614 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c381c8_3f2e_4287_a861_45fd3317b676.slice/crio-e330299c5ae48b945bec8a2515ce2dde83025e917d6d2c0f6d0555574edc0196 WatchSource:0}: Error finding container e330299c5ae48b945bec8a2515ce2dde83025e917d6d2c0f6d0555574edc0196: Status 404 returned error can't find the container with id e330299c5ae48b945bec8a2515ce2dde83025e917d6d2c0f6d0555574edc0196
Apr 22 17:49:23.322510 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.322406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerStarted","Data":"1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b"}
Apr 22 17:49:23.322510 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.322459 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerStarted","Data":"e330299c5ae48b945bec8a2515ce2dde83025e917d6d2c0f6d0555574edc0196"}
Apr 22 17:49:23.323849 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.323824 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" event={"ID":"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2","Type":"ContainerStarted","Data":"d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e"}
Apr 22 17:49:23.324014 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.323853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" event={"ID":"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2","Type":"ContainerStarted","Data":"eecc6262e12e0cba32daf56b47fdee007e43a934a854ca5e1779669a1f9c351c"}
Apr 22 17:49:23.811547 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.811509 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"]
Apr 22 17:49:23.812006 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.811955 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="main" containerID="cri-o://208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a" gracePeriod=30
Apr 22 17:49:23.812139 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:23.812000 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="tokenizer" containerID="cri-o://8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1" gracePeriod=30
Apr 22 17:49:24.328993 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:24.328923 2568 generic.go:358] "Generic (PLEG): container finished" podID="07c381c8-3f2e-4287-a861-45fd3317b676" containerID="1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b" exitCode=0
Apr 22 17:49:24.329475 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:24.329012 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerDied","Data":"1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b"}
Apr 22 17:49:24.338460 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:24.338422 2568 generic.go:358] "Generic (PLEG): container finished" podID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerID="208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a" exitCode=0
Apr 22 17:49:24.338751 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:24.338720 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerDied","Data":"208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a"}
Apr 22 17:49:25.302428 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.302399 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:49:25.345018 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.344975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerStarted","Data":"a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3"}
Apr 22 17:49:25.345501 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.345020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerStarted","Data":"0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43"}
Apr 22 17:49:25.345501 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.345143 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"
Apr 22 17:49:25.347320 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.347295 2568 generic.go:358] "Generic (PLEG): container finished" podID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerID="8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1" exitCode=0
Apr 22 17:49:25.347442 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.347334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerDied","Data":"8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1"}
Apr 22 17:49:25.347442 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.347370 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"
Apr 22 17:49:25.347442 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.347384 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb" event={"ID":"fcf65753-3087-4ead-a95d-c6e58ffa241b","Type":"ContainerDied","Data":"c9aa93df13fdcc8cb015ccb9450b8157a8fe6be7b711a201a2b885f2e995b1aa"}
Apr 22 17:49:25.347632 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.347485 2568 scope.go:117] "RemoveContainer" containerID="8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1"
Apr 22 17:49:25.358098 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.358069 2568 scope.go:117] "RemoveContainer" containerID="208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a"
Apr 22 17:49:25.372037 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.371999 2568 scope.go:117] "RemoveContainer" containerID="7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404"
Apr 22 17:49:25.380716 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.380685 2568 scope.go:117] "RemoveContainer" containerID="8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1"
Apr 22 17:49:25.381184 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:25.381152 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1\": container with ID starting with 8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1 not found: ID does not exist" containerID="8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1"
Apr 22 17:49:25.381278 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.381200 2568 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1"} err="failed to get container status \"8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1\": rpc error: code = NotFound desc = could not find container \"8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1\": container with ID starting with 8f64ef9456f9813e94da67460b042341461dfdd0974f3ca45451233b36d64ed1 not found: ID does not exist" Apr 22 17:49:25.381278 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.381228 2568 scope.go:117] "RemoveContainer" containerID="208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a" Apr 22 17:49:25.381552 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:25.381530 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a\": container with ID starting with 208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a not found: ID does not exist" containerID="208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a" Apr 22 17:49:25.381654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.381561 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a"} err="failed to get container status \"208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a\": rpc error: code = NotFound desc = could not find container \"208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a\": container with ID starting with 208ddae82268b4ee43c6d38ac5c57bcf6bee4ff23366ffb549a4f5416d92049a not found: ID does not exist" Apr 22 17:49:25.381654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.381586 2568 scope.go:117] "RemoveContainer" containerID="7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404" Apr 22 17:49:25.381923 ip-10-0-133-169 
kubenswrapper[2568]: E0422 17:49:25.381898 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404\": container with ID starting with 7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404 not found: ID does not exist" containerID="7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404" Apr 22 17:49:25.382007 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.381932 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404"} err="failed to get container status \"7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404\": rpc error: code = NotFound desc = could not find container \"7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404\": container with ID starting with 7f4b64809c54b4f84c5de0eedd5a234c6987d58c26a894bc4d61c06cca1b0404 not found: ID does not exist" Apr 22 17:49:25.412152 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.412094 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" podStartSLOduration=3.412078674 podStartE2EDuration="3.412078674s" podCreationTimestamp="2026-04-22 17:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:49:25.408409595 +0000 UTC m=+921.390055625" watchObservedRunningTime="2026-04-22 17:49:25.412078674 +0000 UTC m=+921.393724623" Apr 22 17:49:25.473403 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473365 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-uds\") pod 
\"fcf65753-3087-4ead-a95d-c6e58ffa241b\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " Apr 22 17:49:25.473566 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473418 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd8wm\" (UniqueName: \"kubernetes.io/projected/fcf65753-3087-4ead-a95d-c6e58ffa241b-kube-api-access-cd8wm\") pod \"fcf65753-3087-4ead-a95d-c6e58ffa241b\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " Apr 22 17:49:25.473566 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473444 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-tmp\") pod \"fcf65753-3087-4ead-a95d-c6e58ffa241b\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " Apr 22 17:49:25.473566 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473519 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-kserve-provision-location\") pod \"fcf65753-3087-4ead-a95d-c6e58ffa241b\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " Apr 22 17:49:25.473566 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473551 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf65753-3087-4ead-a95d-c6e58ffa241b-tls-certs\") pod \"fcf65753-3087-4ead-a95d-c6e58ffa241b\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " Apr 22 17:49:25.473785 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473581 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-cache\") pod \"fcf65753-3087-4ead-a95d-c6e58ffa241b\" (UID: \"fcf65753-3087-4ead-a95d-c6e58ffa241b\") " Apr 22 
17:49:25.473908 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473863 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fcf65753-3087-4ead-a95d-c6e58ffa241b" (UID: "fcf65753-3087-4ead-a95d-c6e58ffa241b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:25.473994 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473919 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fcf65753-3087-4ead-a95d-c6e58ffa241b" (UID: "fcf65753-3087-4ead-a95d-c6e58ffa241b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:25.474093 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.473981 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fcf65753-3087-4ead-a95d-c6e58ffa241b" (UID: "fcf65753-3087-4ead-a95d-c6e58ffa241b"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:25.474386 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.474363 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:25.474470 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.474390 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:25.474470 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.474404 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:25.474698 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.474668 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fcf65753-3087-4ead-a95d-c6e58ffa241b" (UID: "fcf65753-3087-4ead-a95d-c6e58ffa241b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:25.476259 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.476238 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf65753-3087-4ead-a95d-c6e58ffa241b-kube-api-access-cd8wm" (OuterVolumeSpecName: "kube-api-access-cd8wm") pod "fcf65753-3087-4ead-a95d-c6e58ffa241b" (UID: "fcf65753-3087-4ead-a95d-c6e58ffa241b"). InnerVolumeSpecName "kube-api-access-cd8wm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:49:25.476554 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.476529 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf65753-3087-4ead-a95d-c6e58ffa241b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fcf65753-3087-4ead-a95d-c6e58ffa241b" (UID: "fcf65753-3087-4ead-a95d-c6e58ffa241b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:49:25.575144 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.575106 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf65753-3087-4ead-a95d-c6e58ffa241b-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:25.575144 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.575140 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cd8wm\" (UniqueName: \"kubernetes.io/projected/fcf65753-3087-4ead-a95d-c6e58ffa241b-kube-api-access-cd8wm\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:25.575144 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.575152 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcf65753-3087-4ead-a95d-c6e58ffa241b-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:25.682736 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.682698 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"] Apr 22 17:49:25.692304 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:25.692265 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6677646mhmpb"] Apr 22 17:49:26.532882 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:26.532840 2568 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" path="/var/lib/kubelet/pods/fcf65753-3087-4ead-a95d-c6e58ffa241b/volumes" Apr 22 17:49:27.357799 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:27.357767 2568 generic.go:358] "Generic (PLEG): container finished" podID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerID="d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e" exitCode=0 Apr 22 17:49:27.357994 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:27.357850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" event={"ID":"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2","Type":"ContainerDied","Data":"d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e"} Apr 22 17:49:29.367983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:29.367926 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" event={"ID":"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2","Type":"ContainerStarted","Data":"a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d"} Apr 22 17:49:29.389952 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:29.389867 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" podStartSLOduration=6.269681308 podStartE2EDuration="7.389845292s" podCreationTimestamp="2026-04-22 17:49:22 +0000 UTC" firstStartedPulling="2026-04-22 17:49:27.359110462 +0000 UTC m=+923.340756389" lastFinishedPulling="2026-04-22 17:49:28.479274431 +0000 UTC m=+924.460920373" observedRunningTime="2026-04-22 17:49:29.388189267 +0000 UTC m=+925.369835215" watchObservedRunningTime="2026-04-22 17:49:29.389845292 +0000 UTC m=+925.371491242" Apr 22 17:49:32.442887 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:32.442838 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" Apr 22 17:49:32.442887 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:32.442894 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" Apr 22 17:49:32.455881 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:32.455843 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" Apr 22 17:49:32.712929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:32.712827 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:32.712929 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:32.712870 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:32.714165 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:49:32.714140 2568 logging.go:55] [core] [Channel #47 SubChannel #48]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.29:9003", ServerName: "10.132.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.29:9003: connect: connection refused" Apr 22 17:49:32.715515 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:32.715496 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:33.381512 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:33.381480 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:33.391760 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:33.391730 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" Apr 22 17:49:33.713356 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:33.713244 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.29:9003\" within 1s: context deadline exceeded" Apr 22 17:49:42.713472 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:49:42.713434 2568 logging.go:55] [core] [Channel #49 SubChannel #50]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.29:9003", ServerName: "10.132.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.29:9003: connect: connection refused" Apr 22 17:49:43.713409 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:43.713356 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.29:9003\" within 1s: context deadline exceeded" Apr 22 17:49:54.385025 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:54.384996 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:55.714665 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:55.714630 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"] Apr 22 17:49:55.715071 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:55.714948 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="main" containerID="cri-o://0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43" gracePeriod=30 Apr 22 17:49:55.715071 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:55.714998 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="tokenizer" containerID="cri-o://a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3" gracePeriod=30 Apr 22 17:49:55.724820 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:55.724788 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"] Apr 22 17:49:55.725124 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:55.725081 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerName="main" containerID="cri-o://a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d" gracePeriod=30 Apr 22 17:49:55.978904 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:55.978880 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" Apr 22 17:49:56.025813 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.025767 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-model-cache\") pod \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " Apr 22 17:49:56.026020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.025854 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-dshm\") pod \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " Apr 22 17:49:56.026020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.025886 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvppd\" (UniqueName: \"kubernetes.io/projected/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kube-api-access-bvppd\") pod \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " Apr 22 17:49:56.026020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.025926 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-tls-certs\") pod \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " Apr 22 17:49:56.026214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.026039 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-model-cache" (OuterVolumeSpecName: "model-cache") pod "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" (UID: "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:56.026214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.026060 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kserve-provision-location\") pod \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " Apr 22 17:49:56.026214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.026104 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-home\") pod \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\" (UID: \"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2\") " Apr 22 17:49:56.026387 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.026336 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-home" (OuterVolumeSpecName: "home") pod "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" (UID: "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:56.026387 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.026348 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-model-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.029374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.029095 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kube-api-access-bvppd" (OuterVolumeSpecName: "kube-api-access-bvppd") pod "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" (UID: "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2"). InnerVolumeSpecName "kube-api-access-bvppd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:49:56.029976 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.029929 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" (UID: "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:49:56.030274 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.030242 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-dshm" (OuterVolumeSpecName: "dshm") pod "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" (UID: "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:56.087205 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.087161 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" (UID: "e8f91946-bf44-4bb1-9ce5-dc4ef78686e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:56.127754 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.127719 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-dshm\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.127754 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.127750 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvppd\" (UniqueName: \"kubernetes.io/projected/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kube-api-access-bvppd\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.127754 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.127761 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.128023 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.127771 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.128023 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.127781 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2-home\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:56.465986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.465933 2568 generic.go:358] "Generic (PLEG): container finished" podID="07c381c8-3f2e-4287-a861-45fd3317b676" containerID="0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43" exitCode=0 Apr 22 17:49:56.465986 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.465972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerDied","Data":"0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43"} Apr 22 17:49:56.467414 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.467379 2568 generic.go:358] "Generic (PLEG): container finished" podID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerID="a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d" exitCode=0 Apr 22 17:49:56.467548 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.467433 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" event={"ID":"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2","Type":"ContainerDied","Data":"a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d"} Apr 22 17:49:56.467548 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.467456 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" event={"ID":"e8f91946-bf44-4bb1-9ce5-dc4ef78686e2","Type":"ContainerDied","Data":"eecc6262e12e0cba32daf56b47fdee007e43a934a854ca5e1779669a1f9c351c"} Apr 22 17:49:56.467548 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.467483 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd" Apr 22 17:49:56.467666 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.467486 2568 scope.go:117] "RemoveContainer" containerID="a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d" Apr 22 17:49:56.476634 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.476614 2568 scope.go:117] "RemoveContainer" containerID="d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e" Apr 22 17:49:56.492102 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.492077 2568 scope.go:117] "RemoveContainer" containerID="a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d" Apr 22 17:49:56.492429 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:56.492411 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d\": container with ID starting with a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d not found: ID does not exist" containerID="a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d" Apr 22 17:49:56.492505 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.492442 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d"} err="failed to get container status \"a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d\": rpc error: code = NotFound desc = could not find container \"a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d\": container with ID starting with a55b45d0ea0590c86959c1ba07dc71f0e4461d7adf4a821ce007d73a09f0bf3d not found: ID does not exist" Apr 22 17:49:56.492505 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.492468 2568 scope.go:117] "RemoveContainer" containerID="d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e" Apr 22 
17:49:56.492728 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:56.492700 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e\": container with ID starting with d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e not found: ID does not exist" containerID="d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e" Apr 22 17:49:56.492770 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.492740 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e"} err="failed to get container status \"d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e\": rpc error: code = NotFound desc = could not find container \"d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e\": container with ID starting with d40a59d8da8c4c92842bffffd84d2f354525c072d0db57e904e13b24e312c77e not found: ID does not exist" Apr 22 17:49:56.496430 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:56.496403 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f91946_bf44_4bb1_9ce5_dc4ef78686e2.slice\": RecentStats: unable to find data in memory cache]" Apr 22 17:49:56.497081 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.497062 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"] Apr 22 17:49:56.501797 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.501775 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-5b9bb998b4-9t4fd"] Apr 22 17:49:56.533403 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:56.533372 2568 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" path="/var/lib/kubelet/pods/e8f91946-bf44-4bb1-9ce5-dc4ef78686e2/volumes" Apr 22 17:49:57.262632 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.262603 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:57.335801 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.335775 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-cache\") pod \"07c381c8-3f2e-4287-a861-45fd3317b676\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " Apr 22 17:49:57.336020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.335813 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-tmp\") pod \"07c381c8-3f2e-4287-a861-45fd3317b676\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " Apr 22 17:49:57.336020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.335848 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-uds\") pod \"07c381c8-3f2e-4287-a861-45fd3317b676\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " Apr 22 17:49:57.336020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.335886 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07c381c8-3f2e-4287-a861-45fd3317b676-tls-certs\") pod \"07c381c8-3f2e-4287-a861-45fd3317b676\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " Apr 22 17:49:57.336020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.335933 2568 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-kserve-provision-location\") pod \"07c381c8-3f2e-4287-a861-45fd3317b676\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " Apr 22 17:49:57.336020 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.335991 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzl9m\" (UniqueName: \"kubernetes.io/projected/07c381c8-3f2e-4287-a861-45fd3317b676-kube-api-access-pzl9m\") pod \"07c381c8-3f2e-4287-a861-45fd3317b676\" (UID: \"07c381c8-3f2e-4287-a861-45fd3317b676\") " Apr 22 17:49:57.336280 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.336204 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "07c381c8-3f2e-4287-a861-45fd3317b676" (UID: "07c381c8-3f2e-4287-a861-45fd3317b676"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:57.336280 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.336255 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "07c381c8-3f2e-4287-a861-45fd3317b676" (UID: "07c381c8-3f2e-4287-a861-45fd3317b676"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:57.336390 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.336262 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "07c381c8-3f2e-4287-a861-45fd3317b676" (UID: "07c381c8-3f2e-4287-a861-45fd3317b676"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:57.336646 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.336622 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07c381c8-3f2e-4287-a861-45fd3317b676" (UID: "07c381c8-3f2e-4287-a861-45fd3317b676"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:49:57.338115 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.338091 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c381c8-3f2e-4287-a861-45fd3317b676-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "07c381c8-3f2e-4287-a861-45fd3317b676" (UID: "07c381c8-3f2e-4287-a861-45fd3317b676"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:49:57.338197 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.338145 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c381c8-3f2e-4287-a861-45fd3317b676-kube-api-access-pzl9m" (OuterVolumeSpecName: "kube-api-access-pzl9m") pod "07c381c8-3f2e-4287-a861-45fd3317b676" (UID: "07c381c8-3f2e-4287-a861-45fd3317b676"). InnerVolumeSpecName "kube-api-access-pzl9m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:49:57.437560 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.437520 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:57.437560 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.437549 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:57.437560 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.437559 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:57.437560 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.437568 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07c381c8-3f2e-4287-a861-45fd3317b676-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:57.437808 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.437577 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07c381c8-3f2e-4287-a861-45fd3317b676-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:57.437808 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.437586 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzl9m\" (UniqueName: \"kubernetes.io/projected/07c381c8-3f2e-4287-a861-45fd3317b676-kube-api-access-pzl9m\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:49:57.473204 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:49:57.473172 2568 generic.go:358] "Generic (PLEG): container finished" podID="07c381c8-3f2e-4287-a861-45fd3317b676" containerID="a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3" exitCode=0 Apr 22 17:49:57.473375 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.473241 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerDied","Data":"a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3"} Apr 22 17:49:57.473375 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.473281 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" event={"ID":"07c381c8-3f2e-4287-a861-45fd3317b676","Type":"ContainerDied","Data":"e330299c5ae48b945bec8a2515ce2dde83025e917d6d2c0f6d0555574edc0196"} Apr 22 17:49:57.473375 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.473301 2568 scope.go:117] "RemoveContainer" containerID="a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3" Apr 22 17:49:57.473375 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.473255 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf" Apr 22 17:49:57.481243 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.481221 2568 scope.go:117] "RemoveContainer" containerID="0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43" Apr 22 17:49:57.488482 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.488462 2568 scope.go:117] "RemoveContainer" containerID="1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b" Apr 22 17:49:57.495630 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.495602 2568 scope.go:117] "RemoveContainer" containerID="a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3" Apr 22 17:49:57.495951 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:57.495917 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3\": container with ID starting with a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3 not found: ID does not exist" containerID="a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3" Apr 22 17:49:57.496037 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.495963 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3"} err="failed to get container status \"a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3\": rpc error: code = NotFound desc = could not find container \"a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3\": container with ID starting with a61a077eeb7ea6acfc02947a9b284813b7b14fbbc648d5585835fd259cc740e3 not found: ID does not exist" Apr 22 17:49:57.496037 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.495992 2568 scope.go:117] "RemoveContainer" 
containerID="0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43" Apr 22 17:49:57.496281 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:57.496259 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43\": container with ID starting with 0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43 not found: ID does not exist" containerID="0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43" Apr 22 17:49:57.496352 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.496287 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43"} err="failed to get container status \"0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43\": rpc error: code = NotFound desc = could not find container \"0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43\": container with ID starting with 0037218a5d2ada3984f0c5f397b05cad9c116e470ed75709d09338166709bf43 not found: ID does not exist" Apr 22 17:49:57.496352 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.496304 2568 scope.go:117] "RemoveContainer" containerID="1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b" Apr 22 17:49:57.496577 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:49:57.496556 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b\": container with ID starting with 1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b not found: ID does not exist" containerID="1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b" Apr 22 17:49:57.496652 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.496583 2568 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b"} err="failed to get container status \"1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b\": rpc error: code = NotFound desc = could not find container \"1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b\": container with ID starting with 1fd60ee2fb38a9f1baa1970cc732dc70e7dc5199b7e528bd5bcbff34bf955b1b not found: ID does not exist" Apr 22 17:49:57.497860 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.497836 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"] Apr 22 17:49:57.504330 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:57.504300 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-59ccf6cd9lvvf"] Apr 22 17:49:58.532485 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:49:58.532451 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" path="/var/lib/kubelet/pods/07c381c8-3f2e-4287-a861-45fd3317b676/volumes" Apr 22 17:50:12.349412 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349379 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"] Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349679 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="storage-initializer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349692 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="storage-initializer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349703 2568 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="tokenizer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349709 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="tokenizer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349716 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerName="storage-initializer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349722 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerName="storage-initializer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349729 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349735 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349742 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="storage-initializer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349747 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="storage-initializer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349760 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349766 2568 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349775 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="tokenizer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349782 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="tokenizer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349789 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349794 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349844 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="main" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349855 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="tokenizer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349863 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcf65753-3087-4ead-a95d-c6e58ffa241b" containerName="tokenizer" Apr 22 17:50:12.349861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349870 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="07c381c8-3f2e-4287-a861-45fd3317b676" containerName="main" Apr 22 17:50:12.350562 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.349876 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8f91946-bf44-4bb1-9ce5-dc4ef78686e2" 
containerName="main" Apr 22 17:50:12.354139 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.354119 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.357657 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.357636 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-f2vpx\"" Apr 22 17:50:12.357828 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.357636 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:50:12.357828 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.357684 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 17:50:12.371846 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.371816 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"] Apr 22 17:50:12.457597 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.457558 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.457597 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.457605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.457865 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.457671 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867qt\" (UniqueName: \"kubernetes.io/projected/a78b8142-bec4-4a09-866c-c2af514c77c8-kube-api-access-867qt\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.457865 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.457775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78b8142-bec4-4a09-866c-c2af514c77c8-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.457865 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.457813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.458013 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.457884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.558507 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.558465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-867qt\" (UniqueName: \"kubernetes.io/projected/a78b8142-bec4-4a09-866c-c2af514c77c8-kube-api-access-867qt\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.558684 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.558535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78b8142-bec4-4a09-866c-c2af514c77c8-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.558684 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.558658 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.558774 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.558722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.558834 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.558788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.558834 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.558816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.559141 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.559118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" Apr 22 17:50:12.559237 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.559139 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:12.559237 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.559167 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:12.559237 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.559209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:12.561226 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.561204 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78b8142-bec4-4a09-866c-c2af514c77c8-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:12.570540 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.570513 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-867qt\" (UniqueName: \"kubernetes.io/projected/a78b8142-bec4-4a09-866c-c2af514c77c8-kube-api-access-867qt\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:12.663747 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.663653 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:12.797226 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:12.797051 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"]
Apr 22 17:50:12.800023 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:50:12.799990 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda78b8142_bec4_4a09_866c_c2af514c77c8.slice/crio-56322fc12344ca338b1ba5da4d854a868d2e85a3d54dc4d54354081e9e1df017 WatchSource:0}: Error finding container 56322fc12344ca338b1ba5da4d854a868d2e85a3d54dc4d54354081e9e1df017: Status 404 returned error can't find the container with id 56322fc12344ca338b1ba5da4d854a868d2e85a3d54dc4d54354081e9e1df017
Apr 22 17:50:13.535021 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:13.534927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerStarted","Data":"6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b"}
Apr 22 17:50:13.535389 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:13.535029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerStarted","Data":"56322fc12344ca338b1ba5da4d854a868d2e85a3d54dc4d54354081e9e1df017"}
Apr 22 17:50:14.539175 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:14.539140 2568 generic.go:358] "Generic (PLEG): container finished" podID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerID="6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b" exitCode=0
Apr 22 17:50:14.539561 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:14.539183 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerDied","Data":"6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b"}
Apr 22 17:50:15.544343 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:15.544307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerStarted","Data":"40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576"}
Apr 22 17:50:15.544343 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:15.544345 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerStarted","Data":"ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a"}
Apr 22 17:50:15.544745 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:15.544436 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:15.568573 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:15.568525 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" podStartSLOduration=3.568510616 podStartE2EDuration="3.568510616s" podCreationTimestamp="2026-04-22 17:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:50:15.567175175 +0000 UTC m=+971.548821124" watchObservedRunningTime="2026-04-22 17:50:15.568510616 +0000 UTC m=+971.550156566"
Apr 22 17:50:22.663985 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:22.663920 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:22.664414 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:22.663996 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:22.666752 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:22.666730 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:23.577816 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:23.577738 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:50:45.584203 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:50:45.584176 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:53:04.753175 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:04.753142 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"]
Apr 22 17:53:04.753739 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:04.753547 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="main" containerID="cri-o://ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a" gracePeriod=30
Apr 22 17:53:04.753739 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:04.753598 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="tokenizer" containerID="cri-o://40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576" gracePeriod=30
Apr 22 17:53:05.105022 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:05.104977 2568 generic.go:358] "Generic (PLEG): container finished" podID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerID="ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a" exitCode=0
Apr 22 17:53:05.105198 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:05.105053 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerDied","Data":"ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a"}
Apr 22 17:53:05.583055 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:53:05.583024 2568 logging.go:55] [core] [Channel #150 SubChannel #151]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused"
Apr 22 17:53:06.104095 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.104069 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:53:06.109086 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.109002 2568 generic.go:358] "Generic (PLEG): container finished" podID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerID="40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576" exitCode=0
Apr 22 17:53:06.109086 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.109065 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"
Apr 22 17:53:06.109283 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.109062 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerDied","Data":"40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576"}
Apr 22 17:53:06.109283 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.109179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" event={"ID":"a78b8142-bec4-4a09-866c-c2af514c77c8","Type":"ContainerDied","Data":"56322fc12344ca338b1ba5da4d854a868d2e85a3d54dc4d54354081e9e1df017"}
Apr 22 17:53:06.109283 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.109202 2568 scope.go:117] "RemoveContainer" containerID="40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576"
Apr 22 17:53:06.116701 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.116682 2568 scope.go:117] "RemoveContainer" containerID="ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a"
Apr 22 17:53:06.123772 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.123752 2568 scope.go:117] "RemoveContainer" containerID="6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b"
Apr 22 17:53:06.133214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.133193 2568 scope.go:117] "RemoveContainer" containerID="40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576"
Apr 22 17:53:06.133509 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:53:06.133490 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576\": container with ID starting with 40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576 not found: ID does not exist" containerID="40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576"
Apr 22 17:53:06.133556 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.133520 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576"} err="failed to get container status \"40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576\": rpc error: code = NotFound desc = could not find container \"40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576\": container with ID starting with 40a01c45c52fecba8572a58ea266ae03b882517faf5548389edb038004dfb576 not found: ID does not exist"
Apr 22 17:53:06.133556 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.133539 2568 scope.go:117] "RemoveContainer" containerID="ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a"
Apr 22 17:53:06.133808 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:53:06.133781 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a\": container with ID starting with ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a not found: ID does not exist" containerID="ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a"
Apr 22 17:53:06.133877 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.133818 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a"} err="failed to get container status \"ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a\": rpc error: code = NotFound desc = could not find container \"ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a\": container with ID starting with ce4ba90670e1f077f4e35367badfb6db3b1c83c3e221bcd9dc613c69e7ef045a not found: ID does not exist"
Apr 22 17:53:06.133877 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.133842 2568 scope.go:117] "RemoveContainer" containerID="6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b"
Apr 22 17:53:06.134101 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:53:06.134085 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b\": container with ID starting with 6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b not found: ID does not exist" containerID="6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b"
Apr 22 17:53:06.134152 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.134105 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b"} err="failed to get container status \"6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b\": rpc error: code = NotFound desc = could not find container \"6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b\": container with ID starting with 6d09098c93404c3a69b304cdde1fab48330939fffc1ddbcfbab9caec4310584b not found: ID does not exist"
Apr 22 17:53:06.164641 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.164608 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-cache\") pod \"a78b8142-bec4-4a09-866c-c2af514c77c8\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") "
Apr 22 17:53:06.164826 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.164684 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-867qt\" (UniqueName: \"kubernetes.io/projected/a78b8142-bec4-4a09-866c-c2af514c77c8-kube-api-access-867qt\") pod \"a78b8142-bec4-4a09-866c-c2af514c77c8\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") "
Apr 22 17:53:06.164826 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.164717 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-kserve-provision-location\") pod \"a78b8142-bec4-4a09-866c-c2af514c77c8\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") "
Apr 22 17:53:06.164826 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.164744 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78b8142-bec4-4a09-866c-c2af514c77c8-tls-certs\") pod \"a78b8142-bec4-4a09-866c-c2af514c77c8\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") "
Apr 22 17:53:06.164826 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.164776 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-tmp\") pod \"a78b8142-bec4-4a09-866c-c2af514c77c8\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") "
Apr 22 17:53:06.165070 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.164927 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a78b8142-bec4-4a09-866c-c2af514c77c8" (UID: "a78b8142-bec4-4a09-866c-c2af514c77c8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:53:06.165252 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.165218 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a78b8142-bec4-4a09-866c-c2af514c77c8" (UID: "a78b8142-bec4-4a09-866c-c2af514c77c8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:53:06.165519 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.165494 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a78b8142-bec4-4a09-866c-c2af514c77c8" (UID: "a78b8142-bec4-4a09-866c-c2af514c77c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:53:06.166875 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.166853 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78b8142-bec4-4a09-866c-c2af514c77c8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a78b8142-bec4-4a09-866c-c2af514c77c8" (UID: "a78b8142-bec4-4a09-866c-c2af514c77c8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:53:06.166985 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.166921 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78b8142-bec4-4a09-866c-c2af514c77c8-kube-api-access-867qt" (OuterVolumeSpecName: "kube-api-access-867qt") pod "a78b8142-bec4-4a09-866c-c2af514c77c8" (UID: "a78b8142-bec4-4a09-866c-c2af514c77c8"). InnerVolumeSpecName "kube-api-access-867qt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:53:06.266050 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266012 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-uds\") pod \"a78b8142-bec4-4a09-866c-c2af514c77c8\" (UID: \"a78b8142-bec4-4a09-866c-c2af514c77c8\") "
Apr 22 17:53:06.266218 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266151 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:53:06.266218 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266163 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-867qt\" (UniqueName: \"kubernetes.io/projected/a78b8142-bec4-4a09-866c-c2af514c77c8-kube-api-access-867qt\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:53:06.266218 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266173 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:53:06.266218 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266182 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a78b8142-bec4-4a09-866c-c2af514c77c8-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:53:06.266218 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266191 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:53:06.266380 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.266274 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a78b8142-bec4-4a09-866c-c2af514c77c8" (UID: "a78b8142-bec4-4a09-866c-c2af514c77c8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:53:06.366583 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.366498 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a78b8142-bec4-4a09-866c-c2af514c77c8-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 17:53:06.430086 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.430056 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"]
Apr 22 17:53:06.432802 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.432774 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v"]
Apr 22 17:53:06.533101 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.533064 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" path="/var/lib/kubelet/pods/a78b8142-bec4-4a09-866c-c2af514c77c8/volumes"
Apr 22 17:53:06.583214 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:06.583166 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-src2v" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded"
Apr 22 17:53:13.524536 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.524497 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"]
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.524992 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="storage-initializer"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525011 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="storage-initializer"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525039 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="main"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525049 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="main"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525075 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="tokenizer"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525084 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="tokenizer"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525161 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="tokenizer"
Apr 22 17:53:13.525170 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.525174 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a78b8142-bec4-4a09-866c-c2af514c77c8" containerName="main"
Apr 22 17:53:13.532219 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.532187 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.534832 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.534805 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\""
Apr 22 17:53:13.535483 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.535464 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 22 17:53:13.535483 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.535479 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-wfcnn\""
Apr 22 17:53:13.542552 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.542528 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"]
Apr 22 17:53:13.626778 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.626735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.626996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.626800 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.626996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.626838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.626996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.626883 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11c66277-bf1d-49e6-b28c-bef381d68639-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.626996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.626913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.626996 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.626952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4q27\" (UniqueName: \"kubernetes.io/projected/11c66277-bf1d-49e6-b28c-bef381d68639-kube-api-access-j4q27\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.727730 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.727693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11c66277-bf1d-49e6-b28c-bef381d68639-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.727934 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.727737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.727934 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.727870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4q27\" (UniqueName: \"kubernetes.io/projected/11c66277-bf1d-49e6-b28c-bef381d68639-kube-api-access-j4q27\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728082 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728019 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728143 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728143 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728143 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728333 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728370 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.728407 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.728387 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.730374 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.730348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11c66277-bf1d-49e6-b28c-bef381d68639-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.737622 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.737582 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4q27\" (UniqueName: \"kubernetes.io/projected/11c66277-bf1d-49e6-b28c-bef381d68639-kube-api-access-j4q27\") pod \"stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.842780 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.842736 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:13.976714 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.976679 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"]
Apr 22 17:53:13.980235 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:53:13.980204 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11c66277_bf1d_49e6_b28c_bef381d68639.slice/crio-d3273d8595d581efc9ab74a8572131b48259baef6b22e70e1b41a6fc9d60ba30 WatchSource:0}: Error finding container d3273d8595d581efc9ab74a8572131b48259baef6b22e70e1b41a6fc9d60ba30: Status 404 returned error can't find the container with id d3273d8595d581efc9ab74a8572131b48259baef6b22e70e1b41a6fc9d60ba30
Apr 22 17:53:13.982062 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:13.982044 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:53:14.139317 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:14.139219 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerStarted","Data":"5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b"}
Apr 22 17:53:14.139317 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:14.139268 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerStarted","Data":"d3273d8595d581efc9ab74a8572131b48259baef6b22e70e1b41a6fc9d60ba30"}
Apr 22 17:53:15.143994 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:15.143956 2568 generic.go:358] "Generic (PLEG): container finished" podID="11c66277-bf1d-49e6-b28c-bef381d68639" containerID="5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b" exitCode=0
Apr 22 17:53:15.144366 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:15.144046 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerDied","Data":"5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b"}
Apr 22 17:53:16.149521 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:16.149482 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerStarted","Data":"510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9"}
Apr 22 17:53:16.149932 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:16.149528 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerStarted","Data":"18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc"}
Apr 22 17:53:16.149932 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:16.149597 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"
Apr 22 17:53:16.172275 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:16.172219 2568 pod_startup_latency_tracker.go:104]
"Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" podStartSLOduration=3.172203669 podStartE2EDuration="3.172203669s" podCreationTimestamp="2026-04-22 17:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:16.170037639 +0000 UTC m=+1152.151683589" watchObservedRunningTime="2026-04-22 17:53:16.172203669 +0000 UTC m=+1152.153849663" Apr 22 17:53:23.843209 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:23.843099 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:53:23.843209 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:23.843162 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:53:23.845847 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:23.845825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:53:24.178791 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:24.178696 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:53:45.182349 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:53:45.182319 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:54:04.562307 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:54:04.562279 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:54:04.563236 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:54:04.563213 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:55:40.899641 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:40.899603 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"] Apr 22 17:55:40.900192 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:40.900019 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="main" containerID="cri-o://18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc" gracePeriod=30 Apr 22 17:55:40.900192 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:40.900056 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="tokenizer" containerID="cri-o://510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9" gracePeriod=30 Apr 22 17:55:41.631805 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:41.631772 2568 generic.go:358] "Generic (PLEG): container finished" podID="11c66277-bf1d-49e6-b28c-bef381d68639" containerID="18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc" exitCode=0 Apr 22 17:55:41.632006 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:41.631850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerDied","Data":"18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc"} Apr 22 17:55:42.137805 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.137783 2568 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:55:42.249477 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249404 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4q27\" (UniqueName: \"kubernetes.io/projected/11c66277-bf1d-49e6-b28c-bef381d68639-kube-api-access-j4q27\") pod \"11c66277-bf1d-49e6-b28c-bef381d68639\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " Apr 22 17:55:42.249477 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249447 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-kserve-provision-location\") pod \"11c66277-bf1d-49e6-b28c-bef381d68639\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " Apr 22 17:55:42.249653 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249509 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-tmp\") pod \"11c66277-bf1d-49e6-b28c-bef381d68639\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " Apr 22 17:55:42.249653 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249527 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11c66277-bf1d-49e6-b28c-bef381d68639-tls-certs\") pod \"11c66277-bf1d-49e6-b28c-bef381d68639\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " Apr 22 17:55:42.249653 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249555 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-uds\") pod \"11c66277-bf1d-49e6-b28c-bef381d68639\" 
(UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " Apr 22 17:55:42.249653 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249595 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-cache\") pod \"11c66277-bf1d-49e6-b28c-bef381d68639\" (UID: \"11c66277-bf1d-49e6-b28c-bef381d68639\") " Apr 22 17:55:42.249868 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249847 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "11c66277-bf1d-49e6-b28c-bef381d68639" (UID: "11c66277-bf1d-49e6-b28c-bef381d68639"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:42.249932 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.249882 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "11c66277-bf1d-49e6-b28c-bef381d68639" (UID: "11c66277-bf1d-49e6-b28c-bef381d68639"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:42.250036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.250012 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "11c66277-bf1d-49e6-b28c-bef381d68639" (UID: "11c66277-bf1d-49e6-b28c-bef381d68639"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:42.250329 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.250309 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "11c66277-bf1d-49e6-b28c-bef381d68639" (UID: "11c66277-bf1d-49e6-b28c-bef381d68639"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:42.251546 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.251528 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c66277-bf1d-49e6-b28c-bef381d68639-kube-api-access-j4q27" (OuterVolumeSpecName: "kube-api-access-j4q27") pod "11c66277-bf1d-49e6-b28c-bef381d68639" (UID: "11c66277-bf1d-49e6-b28c-bef381d68639"). InnerVolumeSpecName "kube-api-access-j4q27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:55:42.251627 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.251611 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c66277-bf1d-49e6-b28c-bef381d68639-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "11c66277-bf1d-49e6-b28c-bef381d68639" (UID: "11c66277-bf1d-49e6-b28c-bef381d68639"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:42.350434 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.350401 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:55:42.350434 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.350427 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11c66277-bf1d-49e6-b28c-bef381d68639-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:55:42.350434 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.350438 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:55:42.350626 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.350446 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:55:42.350626 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.350455 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4q27\" (UniqueName: \"kubernetes.io/projected/11c66277-bf1d-49e6-b28c-bef381d68639-kube-api-access-j4q27\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:55:42.350626 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.350465 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11c66277-bf1d-49e6-b28c-bef381d68639-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 17:55:42.637797 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:55:42.637768 2568 generic.go:358] "Generic (PLEG): container finished" podID="11c66277-bf1d-49e6-b28c-bef381d68639" containerID="510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9" exitCode=0 Apr 22 17:55:42.637983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.637834 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" Apr 22 17:55:42.637983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.637852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerDied","Data":"510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9"} Apr 22 17:55:42.637983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.637896 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f" event={"ID":"11c66277-bf1d-49e6-b28c-bef381d68639","Type":"ContainerDied","Data":"d3273d8595d581efc9ab74a8572131b48259baef6b22e70e1b41a6fc9d60ba30"} Apr 22 17:55:42.637983 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.637920 2568 scope.go:117] "RemoveContainer" containerID="510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9" Apr 22 17:55:42.647689 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.647671 2568 scope.go:117] "RemoveContainer" containerID="18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc" Apr 22 17:55:42.654417 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.654402 2568 scope.go:117] "RemoveContainer" containerID="5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b" Apr 22 17:55:42.661121 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.661105 2568 scope.go:117] "RemoveContainer" 
containerID="510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9" Apr 22 17:55:42.661355 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:55:42.661336 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9\": container with ID starting with 510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9 not found: ID does not exist" containerID="510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9" Apr 22 17:55:42.661413 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.661361 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9"} err="failed to get container status \"510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9\": rpc error: code = NotFound desc = could not find container \"510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9\": container with ID starting with 510c2a704eeeff8d9680e177de8eaf3054832bf8cefb0c23f33cce64e2e6a4a9 not found: ID does not exist" Apr 22 17:55:42.661413 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.661376 2568 scope.go:117] "RemoveContainer" containerID="18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc" Apr 22 17:55:42.661589 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:55:42.661570 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc\": container with ID starting with 18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc not found: ID does not exist" containerID="18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc" Apr 22 17:55:42.661627 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.661604 2568 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc"} err="failed to get container status \"18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc\": rpc error: code = NotFound desc = could not find container \"18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc\": container with ID starting with 18fd70d8cdf451c069fddcb7524e887f9d64c4a19ae4eb9af0f6fcb4a1c3d0bc not found: ID does not exist" Apr 22 17:55:42.661627 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.661622 2568 scope.go:117] "RemoveContainer" containerID="5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b" Apr 22 17:55:42.661855 ip-10-0-133-169 kubenswrapper[2568]: E0422 17:55:42.661831 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b\": container with ID starting with 5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b not found: ID does not exist" containerID="5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b" Apr 22 17:55:42.661909 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.661860 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b"} err="failed to get container status \"5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b\": rpc error: code = NotFound desc = could not find container \"5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b\": container with ID starting with 5fb49e1581ac1f8ace6f7b9424ebdb51d64fe103dbf9bce0cfee47834eeb631b not found: ID does not exist" Apr 22 17:55:42.677967 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.675055 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"] Apr 22 
17:55:42.694508 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:42.694481 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-86d4676f8f-9t48f"] Apr 22 17:55:43.302654 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302578 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-869c67c9d6-pd2z7"] Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302888 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="main" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302901 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="main" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302919 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="tokenizer" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302927 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="tokenizer" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302956 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="storage-initializer" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.302965 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="storage-initializer" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.303038 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="main" Apr 22 17:55:43.303034 ip-10-0-133-169 kubenswrapper[2568]: 
I0422 17:55:43.303053 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" containerName="tokenizer" Apr 22 17:55:43.307546 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.307528 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.310731 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.310682 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 17:55:43.311139 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.310734 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-rwfsm\"" Apr 22 17:55:43.313149 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.313123 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-869c67c9d6-pd2z7"] Apr 22 17:55:43.459344 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.459314 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/251aa3df-ddac-4db4-ab20-cfcee419d68a-cert\") pod \"llmisvc-controller-manager-869c67c9d6-pd2z7\" (UID: \"251aa3df-ddac-4db4-ab20-cfcee419d68a\") " pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.459344 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.459345 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99hzd\" (UniqueName: \"kubernetes.io/projected/251aa3df-ddac-4db4-ab20-cfcee419d68a-kube-api-access-99hzd\") pod \"llmisvc-controller-manager-869c67c9d6-pd2z7\" (UID: \"251aa3df-ddac-4db4-ab20-cfcee419d68a\") " pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.560186 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.560106 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/251aa3df-ddac-4db4-ab20-cfcee419d68a-cert\") pod \"llmisvc-controller-manager-869c67c9d6-pd2z7\" (UID: \"251aa3df-ddac-4db4-ab20-cfcee419d68a\") " pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.560186 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.560137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99hzd\" (UniqueName: \"kubernetes.io/projected/251aa3df-ddac-4db4-ab20-cfcee419d68a-kube-api-access-99hzd\") pod \"llmisvc-controller-manager-869c67c9d6-pd2z7\" (UID: \"251aa3df-ddac-4db4-ab20-cfcee419d68a\") " pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.562408 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.562389 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/251aa3df-ddac-4db4-ab20-cfcee419d68a-cert\") pod \"llmisvc-controller-manager-869c67c9d6-pd2z7\" (UID: \"251aa3df-ddac-4db4-ab20-cfcee419d68a\") " pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.569036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.569014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hzd\" (UniqueName: \"kubernetes.io/projected/251aa3df-ddac-4db4-ab20-cfcee419d68a-kube-api-access-99hzd\") pod \"llmisvc-controller-manager-869c67c9d6-pd2z7\" (UID: \"251aa3df-ddac-4db4-ab20-cfcee419d68a\") " pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.618045 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.618019 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:55:43.733817 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:43.733782 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-869c67c9d6-pd2z7"] Apr 22 17:55:43.737583 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:55:43.737551 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod251aa3df_ddac_4db4_ab20_cfcee419d68a.slice/crio-c80e8cb16e0b7464e9d05611b90196ba29cb78610734a294e109ec3016b6e022 WatchSource:0}: Error finding container c80e8cb16e0b7464e9d05611b90196ba29cb78610734a294e109ec3016b6e022: Status 404 returned error can't find the container with id c80e8cb16e0b7464e9d05611b90196ba29cb78610734a294e109ec3016b6e022 Apr 22 17:55:44.535206 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:44.535177 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c66277-bf1d-49e6-b28c-bef381d68639" path="/var/lib/kubelet/pods/11c66277-bf1d-49e6-b28c-bef381d68639/volumes" Apr 22 17:55:44.646019 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:44.645989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" event={"ID":"251aa3df-ddac-4db4-ab20-cfcee419d68a","Type":"ContainerStarted","Data":"c80e8cb16e0b7464e9d05611b90196ba29cb78610734a294e109ec3016b6e022"} Apr 22 17:55:47.657748 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:47.657714 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" event={"ID":"251aa3df-ddac-4db4-ab20-cfcee419d68a","Type":"ContainerStarted","Data":"710fa144a3fdf15acac960cf246c465f9575cd47482e80b7d674f4445877b3fc"} Apr 22 17:55:47.658156 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:47.657875 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 
17:55:47.676096 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:55:47.676045 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" podStartSLOduration=1.5028784769999999 podStartE2EDuration="4.676010738s" podCreationTimestamp="2026-04-22 17:55:43 +0000 UTC" firstStartedPulling="2026-04-22 17:55:43.738862678 +0000 UTC m=+1299.720508605" lastFinishedPulling="2026-04-22 17:55:46.911994932 +0000 UTC m=+1302.893640866" observedRunningTime="2026-04-22 17:55:47.674153125 +0000 UTC m=+1303.655799074" watchObservedRunningTime="2026-04-22 17:55:47.676010738 +0000 UTC m=+1303.657656687" Apr 22 17:56:18.663193 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:18.663159 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-869c67c9d6-pd2z7" Apr 22 17:56:49.865750 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.865715 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg"] Apr 22 17:56:49.869349 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.869325 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.871824 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.871798 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-jddvt\"" Apr 22 17:56:49.890764 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.890728 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg"] Apr 22 17:56:49.999061 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999364 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999515 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999515 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999421 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999515 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999515 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9r7\" (UniqueName: \"kubernetes.io/projected/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-kube-api-access-rs9r7\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999734 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999573 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999734 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:49.999734 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:49.999661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100092 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100250 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100250 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100250 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100250 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9r7\" (UniqueName: \"kubernetes.io/projected/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-kube-api-access-rs9r7\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100431 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100251 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100431 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100282 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100431 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100325 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100431 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100649 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100598 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100828 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-workload-certs\") pod 
\"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.100828 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.101034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.100838 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.101034 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.101027 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.102547 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.102528 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 
22 17:56:50.102811 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.102791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.109813 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.109767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9r7\" (UniqueName: \"kubernetes.io/projected/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-kube-api-access-rs9r7\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.110637 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.110610 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cdf9d78-5d64-4593-9cf9-d20f92e8d70d-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-5qjtg\" (UID: \"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.181811 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.181726 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:50.327861 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.327833 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg"] Apr 22 17:56:50.330266 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:56:50.330225 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cdf9d78_5d64_4593_9cf9_d20f92e8d70d.slice/crio-fd88203b15bb2bc32606ebadb9ebd3a934bb5494aaaf86faab023baf7701488b WatchSource:0}: Error finding container fd88203b15bb2bc32606ebadb9ebd3a934bb5494aaaf86faab023baf7701488b: Status 404 returned error can't find the container with id fd88203b15bb2bc32606ebadb9ebd3a934bb5494aaaf86faab023baf7701488b Apr 22 17:56:50.332367 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.332337 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:56:50.332458 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.332398 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:56:50.332458 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.332439 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 17:56:50.864264 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.864226 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" 
event={"ID":"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d","Type":"ContainerStarted","Data":"de81153d1fb69d99c0315f6bcd9829991be47b9b079a3f538de060a483336c63"} Apr 22 17:56:50.864264 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.864266 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" event={"ID":"0cdf9d78-5d64-4593-9cf9-d20f92e8d70d","Type":"ContainerStarted","Data":"fd88203b15bb2bc32606ebadb9ebd3a934bb5494aaaf86faab023baf7701488b"} Apr 22 17:56:50.887685 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:50.887631 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" podStartSLOduration=1.887616883 podStartE2EDuration="1.887616883s" podCreationTimestamp="2026-04-22 17:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:50.885499094 +0000 UTC m=+1366.867145044" watchObservedRunningTime="2026-04-22 17:56:50.887616883 +0000 UTC m=+1366.869262832" Apr 22 17:56:51.182820 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:51.182744 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:51.187494 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:51.187467 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:51.868166 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:51.868133 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:56:51.869267 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:56:51.869241 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-5qjtg" Apr 22 17:57:08.157534 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.157502 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"] Apr 22 17:57:08.161036 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.161010 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.163610 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.163585 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\"" Apr 22 17:57:08.163721 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.163700 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-ncfgz\"" Apr 22 17:57:08.164485 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.164468 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 17:57:08.171792 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.171768 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"] Apr 22 17:57:08.256685 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.256653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.256848 
ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.256701 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.256848 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.256795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.256848 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.256840 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff816ae3-c818-4999-998c-1330a2c28569-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.256992 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.256862 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.256992 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.256955 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfvc8\" (UniqueName: \"kubernetes.io/projected/ff816ae3-c818-4999-998c-1330a2c28569-kube-api-access-sfvc8\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358427 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfvc8\" (UniqueName: \"kubernetes.io/projected/ff816ae3-c818-4999-998c-1330a2c28569-kube-api-access-sfvc8\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358463 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358530 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff816ae3-c818-4999-998c-1330a2c28569-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358612 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358902 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.358981 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.358934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.359041 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.359002 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.359041 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.359021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.361286 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.361260 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff816ae3-c818-4999-998c-1330a2c28569-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.366485 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.366453 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfvc8\" (UniqueName: \"kubernetes.io/projected/ff816ae3-c818-4999-998c-1330a2c28569-kube-api-access-sfvc8\") pod \"router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.470686 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.470598 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:08.601576 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.601548 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"] Apr 22 17:57:08.604168 ip-10-0-133-169 kubenswrapper[2568]: W0422 17:57:08.604139 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff816ae3_c818_4999_998c_1330a2c28569.slice/crio-7eb6215c60f106fbfe3be3c01e2b59834bfa07106f75bba8ca98676cf66097cf WatchSource:0}: Error finding container 7eb6215c60f106fbfe3be3c01e2b59834bfa07106f75bba8ca98676cf66097cf: Status 404 returned error can't find the container with id 7eb6215c60f106fbfe3be3c01e2b59834bfa07106f75bba8ca98676cf66097cf Apr 22 17:57:08.927754 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.927717 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerStarted","Data":"422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718"} Apr 22 
17:57:08.927925 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:08.927760 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerStarted","Data":"7eb6215c60f106fbfe3be3c01e2b59834bfa07106f75bba8ca98676cf66097cf"} Apr 22 17:57:09.933336 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:09.933294 2568 generic.go:358] "Generic (PLEG): container finished" podID="ff816ae3-c818-4999-998c-1330a2c28569" containerID="422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718" exitCode=0 Apr 22 17:57:09.933708 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:09.933381 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerDied","Data":"422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718"} Apr 22 17:57:10.939313 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:10.939277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerStarted","Data":"81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26"} Apr 22 17:57:10.939313 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:10.939316 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerStarted","Data":"41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6"} Apr 22 17:57:10.939839 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:10.939403 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:10.964998 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:10.964951 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" podStartSLOduration=2.964920868 podStartE2EDuration="2.964920868s" podCreationTimestamp="2026-04-22 17:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:10.962317366 +0000 UTC m=+1386.943963316" watchObservedRunningTime="2026-04-22 17:57:10.964920868 +0000 UTC m=+1386.946566816" Apr 22 17:57:18.471322 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:18.471283 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:18.471322 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:18.471331 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:18.473987 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:18.473961 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:18.968603 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:18.968575 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:57:39.972297 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:57:39.972268 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" Apr 22 17:59:04.583289 ip-10-0-133-169 
kubenswrapper[2568]: I0422 17:59:04.583255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 17:59:04.585075 ip-10-0-133-169 kubenswrapper[2568]: I0422 17:59:04.585053 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log" Apr 22 18:00:48.002116 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:48.002078 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"] Apr 22 18:00:48.003448 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:48.003384 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="main" containerID="cri-o://41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6" gracePeriod=30 Apr 22 18:00:48.003787 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:48.003704 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="tokenizer" containerID="cri-o://81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26" gracePeriod=30 Apr 22 18:00:48.674953 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:48.674916 2568 generic.go:358] "Generic (PLEG): container finished" podID="ff816ae3-c818-4999-998c-1330a2c28569" containerID="41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6" exitCode=0 Apr 22 18:00:48.675136 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:48.674999 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerDied","Data":"41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6"}
Apr 22 18:00:48.968634 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:48.968538 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.34:8082/healthz\": dial tcp 10.132.0.34:8082: connect: connection refused"
Apr 22 18:00:49.280077 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.280054 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"
Apr 22 18:00:49.389495 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389457 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff816ae3-c818-4999-998c-1330a2c28569-tls-certs\") pod \"ff816ae3-c818-4999-998c-1330a2c28569\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") "
Apr 22 18:00:49.389693 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389507 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-kserve-provision-location\") pod \"ff816ae3-c818-4999-998c-1330a2c28569\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") "
Apr 22 18:00:49.389693 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389557 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfvc8\" (UniqueName: \"kubernetes.io/projected/ff816ae3-c818-4999-998c-1330a2c28569-kube-api-access-sfvc8\") pod \"ff816ae3-c818-4999-998c-1330a2c28569\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") "
Apr 22 18:00:49.389693 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389575 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-tmp\") pod \"ff816ae3-c818-4999-998c-1330a2c28569\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") "
Apr 22 18:00:49.389693 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389601 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-uds\") pod \"ff816ae3-c818-4999-998c-1330a2c28569\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") "
Apr 22 18:00:49.389693 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389636 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-cache\") pod \"ff816ae3-c818-4999-998c-1330a2c28569\" (UID: \"ff816ae3-c818-4999-998c-1330a2c28569\") "
Apr 22 18:00:49.389996 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.389886 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ff816ae3-c818-4999-998c-1330a2c28569" (UID: "ff816ae3-c818-4999-998c-1330a2c28569"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:49.390061 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.390034 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ff816ae3-c818-4999-998c-1330a2c28569" (UID: "ff816ae3-c818-4999-998c-1330a2c28569"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:49.390061 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.390037 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ff816ae3-c818-4999-998c-1330a2c28569" (UID: "ff816ae3-c818-4999-998c-1330a2c28569"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:49.390385 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.390362 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ff816ae3-c818-4999-998c-1330a2c28569" (UID: "ff816ae3-c818-4999-998c-1330a2c28569"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:00:49.391740 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.391720 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff816ae3-c818-4999-998c-1330a2c28569-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ff816ae3-c818-4999-998c-1330a2c28569" (UID: "ff816ae3-c818-4999-998c-1330a2c28569"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:00:49.391740 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.391726 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff816ae3-c818-4999-998c-1330a2c28569-kube-api-access-sfvc8" (OuterVolumeSpecName: "kube-api-access-sfvc8") pod "ff816ae3-c818-4999-998c-1330a2c28569" (UID: "ff816ae3-c818-4999-998c-1330a2c28569"). InnerVolumeSpecName "kube-api-access-sfvc8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:00:49.490609 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.490576 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfvc8\" (UniqueName: \"kubernetes.io/projected/ff816ae3-c818-4999-998c-1330a2c28569-kube-api-access-sfvc8\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 18:00:49.490609 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.490604 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 18:00:49.490609 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.490614 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 18:00:49.490839 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.490623 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 18:00:49.490839 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.490637 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff816ae3-c818-4999-998c-1330a2c28569-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 18:00:49.490839 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.490649 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff816ae3-c818-4999-998c-1330a2c28569-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\""
Apr 22 18:00:49.680201 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.680166 2568 generic.go:358] "Generic (PLEG): container finished" podID="ff816ae3-c818-4999-998c-1330a2c28569" containerID="81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26" exitCode=0
Apr 22 18:00:49.680391 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.680222 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerDied","Data":"81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26"}
Apr 22 18:00:49.680391 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.680249 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj" event={"ID":"ff816ae3-c818-4999-998c-1330a2c28569","Type":"ContainerDied","Data":"7eb6215c60f106fbfe3be3c01e2b59834bfa07106f75bba8ca98676cf66097cf"}
Apr 22 18:00:49.680391 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.680252 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"
Apr 22 18:00:49.680391 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.680264 2568 scope.go:117] "RemoveContainer" containerID="81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26"
Apr 22 18:00:49.689199 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.689019 2568 scope.go:117] "RemoveContainer" containerID="41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6"
Apr 22 18:00:49.696375 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.696355 2568 scope.go:117] "RemoveContainer" containerID="422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718"
Apr 22 18:00:49.703125 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.703100 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"]
Apr 22 18:00:49.703830 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.703818 2568 scope.go:117] "RemoveContainer" containerID="81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26"
Apr 22 18:00:49.704202 ip-10-0-133-169 kubenswrapper[2568]: E0422 18:00:49.704181 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26\": container with ID starting with 81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26 not found: ID does not exist" containerID="81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26"
Apr 22 18:00:49.704290 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.704212 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26"} err="failed to get container status \"81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26\": rpc error: code = NotFound desc = could not find container \"81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26\": container with ID starting with 81edf41fa03d883bcd1d9baf7d63a8fc0dd4d0f44ec3262719ac92ba56d94e26 not found: ID does not exist"
Apr 22 18:00:49.704290 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.704232 2568 scope.go:117] "RemoveContainer" containerID="41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6"
Apr 22 18:00:49.704470 ip-10-0-133-169 kubenswrapper[2568]: E0422 18:00:49.704450 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6\": container with ID starting with 41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6 not found: ID does not exist" containerID="41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6"
Apr 22 18:00:49.704507 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.704477 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6"} err="failed to get container status \"41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6\": rpc error: code = NotFound desc = could not find container \"41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6\": container with ID starting with 41fab182b93d3a391ceca21618ef296e7d9f8aa90b25beb6c38435ec011459e6 not found: ID does not exist"
Apr 22 18:00:49.704507 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.704492 2568 scope.go:117] "RemoveContainer" containerID="422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718"
Apr 22 18:00:49.704709 ip-10-0-133-169 kubenswrapper[2568]: E0422 18:00:49.704691 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718\": container with ID starting with 422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718 not found: ID does not exist" containerID="422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718"
Apr 22 18:00:49.704746 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.704716 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718"} err="failed to get container status \"422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718\": rpc error: code = NotFound desc = could not find container \"422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718\": container with ID starting with 422a73dcd0f4ccba7909014f02b9dc9b29504e1ce17b1fbf58dfa36fb1690718 not found: ID does not exist"
Apr 22 18:00:49.707859 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:49.707834 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-cbd47f9747dmcj"]
Apr 22 18:00:50.532719 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:00:50.532681 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff816ae3-c818-4999-998c-1330a2c28569" path="/var/lib/kubelet/pods/ff816ae3-c818-4999-998c-1330a2c28569/volumes"
Apr 22 18:04:04.603717 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:04.603680 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 18:04:04.607020 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:04.606994 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 18:04:18.538161 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538128 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"]
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538427 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="tokenizer"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538439 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="tokenizer"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538448 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="storage-initializer"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538455 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="storage-initializer"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538472 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="main"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538478 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="main"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538534 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="tokenizer"
Apr 22 18:04:18.538543 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.538544 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff816ae3-c818-4999-998c-1330a2c28569" containerName="main"
Apr 22 18:04:18.541628 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.541605 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.544818 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.544793 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 22 18:04:18.544980 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.544856 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ts22s\""
Apr 22 18:04:18.544980 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.544803 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-mmvmj\""
Apr 22 18:04:18.554202 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.554173 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"]
Apr 22 18:04:18.623270 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.623230 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.623270 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.623268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjhgf\" (UniqueName: \"kubernetes.io/projected/865953d6-0531-4f68-a79e-af9d61e7b398-kube-api-access-wjhgf\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.623490 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.623439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.623490 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.623484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/865953d6-0531-4f68-a79e-af9d61e7b398-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.623572 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.623552 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.623615 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.623599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724350 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724350 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/865953d6-0531-4f68-a79e-af9d61e7b398-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724616 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724616 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724616 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724616 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjhgf\" (UniqueName: \"kubernetes.io/projected/865953d6-0531-4f68-a79e-af9d61e7b398-kube-api-access-wjhgf\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724837 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724885 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.724927 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.725004 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.724984 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.727078 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.727059 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/865953d6-0531-4f68-a79e-af9d61e7b398-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.743819 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.743790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjhgf\" (UniqueName: \"kubernetes.io/projected/865953d6-0531-4f68-a79e-af9d61e7b398-kube-api-access-wjhgf\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.855897 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.855860 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:18.983621 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.983538 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"]
Apr 22 18:04:18.986011 ip-10-0-133-169 kubenswrapper[2568]: W0422 18:04:18.985983 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865953d6_0531_4f68_a79e_af9d61e7b398.slice/crio-faac236789022971324cf0bacac826f1bc0899cd0f8f87e2c0abf6bc8946ac99 WatchSource:0}: Error finding container faac236789022971324cf0bacac826f1bc0899cd0f8f87e2c0abf6bc8946ac99: Status 404 returned error can't find the container with id faac236789022971324cf0bacac826f1bc0899cd0f8f87e2c0abf6bc8946ac99
Apr 22 18:04:18.987934 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:18.987916 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:04:19.373331 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:19.373291 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerStarted","Data":"f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b"}
Apr 22 18:04:19.373331 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:19.373332 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerStarted","Data":"faac236789022971324cf0bacac826f1bc0899cd0f8f87e2c0abf6bc8946ac99"}
Apr 22 18:04:20.378481 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:20.378441 2568 generic.go:358] "Generic (PLEG): container finished" podID="865953d6-0531-4f68-a79e-af9d61e7b398" containerID="f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b" exitCode=0
Apr 22 18:04:20.378971 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:20.378528 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerDied","Data":"f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b"}
Apr 22 18:04:21.384068 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:21.384017 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerStarted","Data":"485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0"}
Apr 22 18:04:21.384068 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:21.384073 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerStarted","Data":"39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197"}
Apr 22 18:04:21.384565 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:21.384114 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:21.409959 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:21.407442 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" podStartSLOduration=3.4074209509999998 podStartE2EDuration="3.407420951s" podCreationTimestamp="2026-04-22 18:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:04:21.403380067 +0000 UTC m=+1817.385026015" watchObservedRunningTime="2026-04-22 18:04:21.407420951 +0000 UTC m=+1817.389066901"
Apr 22 18:04:28.856476 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:28.856433 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:28.857018 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:28.856491 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:28.859167 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:28.859146 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:29.413145 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:29.413110 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:04:50.417486 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:04:50.417455 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:07:31.010436 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:31.010396 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"]
Apr 22 18:07:31.011513 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:31.011328 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="main" containerID="cri-o://39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197" gracePeriod=30
Apr 22 18:07:31.011513 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:31.011393 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="tokenizer" containerID="cri-o://485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0" gracePeriod=30
Apr 22 18:07:32.027066 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.027031 2568 generic.go:358] "Generic (PLEG): container finished" podID="865953d6-0531-4f68-a79e-af9d61e7b398" containerID="39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197" exitCode=0
Apr 22 18:07:32.027483 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.027109 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerDied","Data":"39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197"}
Apr 22 18:07:32.374735 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.374712 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"
Apr 22 18:07:32.488620 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.488581 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjhgf\" (UniqueName: \"kubernetes.io/projected/865953d6-0531-4f68-a79e-af9d61e7b398-kube-api-access-wjhgf\") pod \"865953d6-0531-4f68-a79e-af9d61e7b398\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") "
Apr 22 18:07:32.488810 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.488634 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-cache\") pod \"865953d6-0531-4f68-a79e-af9d61e7b398\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") "
Apr 22 18:07:32.488810 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.488670 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-uds\") pod \"865953d6-0531-4f68-a79e-af9d61e7b398\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") "
Apr 22 18:07:32.488810 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.488725 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/865953d6-0531-4f68-a79e-af9d61e7b398-tls-certs\") pod \"865953d6-0531-4f68-a79e-af9d61e7b398\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") "
Apr 22 18:07:32.488810 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.488771 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-kserve-provision-location\") pod \"865953d6-0531-4f68-a79e-af9d61e7b398\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") "
Apr 22 18:07:32.488810 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.488807 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-tmp\") pod \"865953d6-0531-4f68-a79e-af9d61e7b398\" (UID: \"865953d6-0531-4f68-a79e-af9d61e7b398\") "
Apr 22 18:07:32.489117 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.489051 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "865953d6-0531-4f68-a79e-af9d61e7b398" (UID: "865953d6-0531-4f68-a79e-af9d61e7b398"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:32.489117 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.489062 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "865953d6-0531-4f68-a79e-af9d61e7b398" (UID: "865953d6-0531-4f68-a79e-af9d61e7b398"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:32.489293 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.489266 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "865953d6-0531-4f68-a79e-af9d61e7b398" (UID: "865953d6-0531-4f68-a79e-af9d61e7b398"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:32.489541 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.489518 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "865953d6-0531-4f68-a79e-af9d61e7b398" (UID: "865953d6-0531-4f68-a79e-af9d61e7b398"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:07:32.490780 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.490756 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865953d6-0531-4f68-a79e-af9d61e7b398-kube-api-access-wjhgf" (OuterVolumeSpecName: "kube-api-access-wjhgf") pod "865953d6-0531-4f68-a79e-af9d61e7b398" (UID: "865953d6-0531-4f68-a79e-af9d61e7b398"). InnerVolumeSpecName "kube-api-access-wjhgf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:07:32.490929 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.490907 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865953d6-0531-4f68-a79e-af9d61e7b398-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "865953d6-0531-4f68-a79e-af9d61e7b398" (UID: "865953d6-0531-4f68-a79e-af9d61e7b398"). InnerVolumeSpecName "tls-certs".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:07:32.589506 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.589477 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/865953d6-0531-4f68-a79e-af9d61e7b398-tls-certs\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.589506 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.589507 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-kserve-provision-location\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.589690 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.589517 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-tmp\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.589690 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.589529 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wjhgf\" (UniqueName: \"kubernetes.io/projected/865953d6-0531-4f68-a79e-af9d61e7b398-kube-api-access-wjhgf\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.589690 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.589543 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-cache\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 18:07:32.589690 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:32.589552 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/865953d6-0531-4f68-a79e-af9d61e7b398-tokenizer-uds\") on node \"ip-10-0-133-169.ec2.internal\" DevicePath \"\"" Apr 22 18:07:33.032372 ip-10-0-133-169 
kubenswrapper[2568]: I0422 18:07:33.032269 2568 generic.go:358] "Generic (PLEG): container finished" podID="865953d6-0531-4f68-a79e-af9d61e7b398" containerID="485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0" exitCode=0 Apr 22 18:07:33.032372 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.032351 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" Apr 22 18:07:33.032372 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.032358 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerDied","Data":"485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0"} Apr 22 18:07:33.032860 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.032401 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg" event={"ID":"865953d6-0531-4f68-a79e-af9d61e7b398","Type":"ContainerDied","Data":"faac236789022971324cf0bacac826f1bc0899cd0f8f87e2c0abf6bc8946ac99"} Apr 22 18:07:33.032860 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.032422 2568 scope.go:117] "RemoveContainer" containerID="485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0" Apr 22 18:07:33.040687 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.040663 2568 scope.go:117] "RemoveContainer" containerID="39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197" Apr 22 18:07:33.048985 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.048958 2568 scope.go:117] "RemoveContainer" containerID="f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b" Apr 22 18:07:33.050608 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.050581 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"] Apr 22 18:07:33.054829 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.054797 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedq6kg"] Apr 22 18:07:33.057810 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.057790 2568 scope.go:117] "RemoveContainer" containerID="485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0" Apr 22 18:07:33.058206 ip-10-0-133-169 kubenswrapper[2568]: E0422 18:07:33.058157 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0\": container with ID starting with 485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0 not found: ID does not exist" containerID="485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0" Apr 22 18:07:33.058309 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.058214 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0"} err="failed to get container status \"485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0\": rpc error: code = NotFound desc = could not find container \"485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0\": container with ID starting with 485ce32d2a9ff8ff829b028fe0443ebfaa3292a567f73eca77ab5cc7a206e1e0 not found: ID does not exist" Apr 22 18:07:33.058309 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.058235 2568 scope.go:117] "RemoveContainer" containerID="39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197" Apr 22 18:07:33.058488 ip-10-0-133-169 kubenswrapper[2568]: E0422 18:07:33.058469 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197\": container with ID starting with 39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197 not found: ID does not exist" containerID="39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197" Apr 22 18:07:33.058542 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.058497 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197"} err="failed to get container status \"39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197\": rpc error: code = NotFound desc = could not find container \"39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197\": container with ID starting with 39160c63f1c8010ac3caa2a5558074195675ab635d2cd6f9a8646a61e5a9a197 not found: ID does not exist" Apr 22 18:07:33.058542 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.058515 2568 scope.go:117] "RemoveContainer" containerID="f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b" Apr 22 18:07:33.058752 ip-10-0-133-169 kubenswrapper[2568]: E0422 18:07:33.058734 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b\": container with ID starting with f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b not found: ID does not exist" containerID="f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b" Apr 22 18:07:33.058800 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:33.058757 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b"} err="failed to get container status \"f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b\": rpc error: code = NotFound desc = could not find 
container \"f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b\": container with ID starting with f8d61fff842b00e9d3821c39f88b05cdfc5d4b6e36757f4a3d2b060891a3657b not found: ID does not exist" Apr 22 18:07:34.539461 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:07:34.539417 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" path="/var/lib/kubelet/pods/865953d6-0531-4f68-a79e-af9d61e7b398/volumes" Apr 22 18:08:18.899673 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:18.899642 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:18.928504 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:18.928473 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:19.832033 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:19.832000 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:19.847008 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:19.846979 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:20.746692 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:20.746660 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:20.761350 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:20.761323 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:21.685180 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:21.685146 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:21.701746 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:21.701720 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:22.588065 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:22.588035 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:22.602415 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:22.602387 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:23.477377 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:23.477299 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:23.491764 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:23.491732 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:24.388493 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:24.388466 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:24.402826 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:24.402803 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:25.298257 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:25.298228 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:25.313814 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:25.313788 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:26.188770 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:26.188737 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:26.202709 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:26.202680 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:27.079473 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:27.079440 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:27.096098 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:27.096075 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:28.011453 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:28.011424 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:28.028920 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:28.028892 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:28.925431 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:28.925399 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:28.943282 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:28.943255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:29.876507 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:29.876476 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:29.890753 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:29.890723 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:30.754076 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:30.754044 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-q8cf2_c7faed16-9523-4c8f-b29d-6c0555986b88/istio-proxy/0.log" Apr 22 18:08:30.767750 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:30.767721 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-5qjtg_0cdf9d78-5d64-4593-9cf9-d20f92e8d70d/istio-proxy/0.log" Apr 22 18:08:31.760307 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:31.760281 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vtxh4_af5eb2e4-1b98-43d1-98e5-57c92f8736ee/discovery/0.log" Apr 22 18:08:32.568527 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:32.568495 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vtxh4_af5eb2e4-1b98-43d1-98e5-57c92f8736ee/discovery/0.log" Apr 22 18:08:33.338915 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:33.338885 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-dmcng_dc1b9d03-8eea-42c8-bf4c-980377435abd/manager/0.log" Apr 22 18:08:33.397115 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:33.397082 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-7cv4v_c78527fa-b7db-4dd8-909a-b848cca5ac08/manager/0.log" Apr 22 18:08:33.434359 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:33.434327 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rk74b_698e6b0e-c4d5-4d1f-a178-962254f68e1c/manager/0.log" Apr 22 18:08:38.672465 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:38.672425 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zrp9s_b8054fdb-27d5-4a8f-9c9c-dd1cf525eedc/global-pull-secret-syncer/0.log" Apr 22 
18:08:38.704431 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:38.704394 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-85kw2_f47ae572-69e3-4737-9211-cfdba8868f24/konnectivity-agent/0.log" Apr 22 18:08:38.791020 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:38.790989 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-169.ec2.internal_dd997a444939df859d631949c5820f34/haproxy/0.log" Apr 22 18:08:43.272538 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:43.272503 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-dmcng_dc1b9d03-8eea-42c8-bf4c-980377435abd/manager/0.log" Apr 22 18:08:43.377653 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:43.377617 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-7cv4v_c78527fa-b7db-4dd8-909a-b848cca5ac08/manager/0.log" Apr 22 18:08:43.427993 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:43.427964 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-rk74b_698e6b0e-c4d5-4d1f-a178-962254f68e1c/manager/0.log" Apr 22 18:08:44.849074 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:44.849039 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5xpcn_58d47ea9-316e-41f3-8a86-33b4f43187fe/node-exporter/0.log" Apr 22 18:08:44.868860 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:44.868836 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5xpcn_58d47ea9-316e-41f3-8a86-33b4f43187fe/kube-rbac-proxy/0.log" Apr 22 18:08:44.888709 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:44.888678 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-5xpcn_58d47ea9-316e-41f3-8a86-33b4f43187fe/init-textfile/0.log" Apr 22 18:08:46.852478 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:46.852448 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wnwfv_47363b61-021b-4c33-bee9-3f7fb1dc9969/networking-console-plugin/0.log" Apr 22 18:08:47.394975 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.394926 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"] Apr 22 18:08:47.395259 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395246 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="tokenizer" Apr 22 18:08:47.395303 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395261 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="tokenizer" Apr 22 18:08:47.395303 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395278 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="storage-initializer" Apr 22 18:08:47.395303 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395283 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="storage-initializer" Apr 22 18:08:47.395303 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395291 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="main" Apr 22 18:08:47.395303 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395297 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="main" Apr 22 18:08:47.395483 ip-10-0-133-169 kubenswrapper[2568]: I0422 
18:08:47.395381 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="main" Apr 22 18:08:47.395483 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.395393 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="865953d6-0531-4f68-a79e-af9d61e7b398" containerName="tokenizer" Apr 22 18:08:47.398681 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.398662 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.400974 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.400954 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8t2w\"/\"kube-root-ca.crt\"" Apr 22 18:08:47.401822 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.401804 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h8t2w\"/\"default-dockercfg-r2vhp\"" Apr 22 18:08:47.401882 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.401804 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h8t2w\"/\"openshift-service-ca.crt\"" Apr 22 18:08:47.407838 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.407810 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"] Apr 22 18:08:47.427907 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.427861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-sys\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.428108 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.427933 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-proc\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.428108 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.427984 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc75c\" (UniqueName: \"kubernetes.io/projected/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-kube-api-access-nc75c\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.428108 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.428042 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-podres\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.428108 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.428100 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-lib-modules\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.529397 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-lib-modules\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: 
\"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.529397 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529404 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-sys\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-proc\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc75c\" (UniqueName: \"kubernetes.io/projected/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-kube-api-access-nc75c\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-podres\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-lib-modules\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-sys\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529583 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-proc\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:47.529659 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.529601 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-podres\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:47.537472 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.537448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc75c\" (UniqueName: \"kubernetes.io/projected/fc08a4b8-15d5-437b-b6e3-a6aaec97b153-kube-api-access-nc75c\") pod \"perf-node-gather-daemonset-8qkcb\" (UID: \"fc08a4b8-15d5-437b-b6e3-a6aaec97b153\") " pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:47.709200 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.709108 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:47.831832 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:47.831799 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"]
Apr 22 18:08:47.835277 ip-10-0-133-169 kubenswrapper[2568]: W0422 18:08:47.835247 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfc08a4b8_15d5_437b_b6e3_a6aaec97b153.slice/crio-647fd18076cdd2a54339f92f8b9d17177db07f00cbf6bebbd14e6d62eead34cd WatchSource:0}: Error finding container 647fd18076cdd2a54339f92f8b9d17177db07f00cbf6bebbd14e6d62eead34cd: Status 404 returned error can't find the container with id 647fd18076cdd2a54339f92f8b9d17177db07f00cbf6bebbd14e6d62eead34cd
Apr 22 18:08:48.281517 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:48.281474 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" event={"ID":"fc08a4b8-15d5-437b-b6e3-a6aaec97b153","Type":"ContainerStarted","Data":"7b86ce5f0305dd2939810d854a864c6a061a8f0165c06ccaded936405aac406e"}
Apr 22 18:08:48.281517 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:48.281519 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" event={"ID":"fc08a4b8-15d5-437b-b6e3-a6aaec97b153","Type":"ContainerStarted","Data":"647fd18076cdd2a54339f92f8b9d17177db07f00cbf6bebbd14e6d62eead34cd"}
Apr 22 18:08:48.281984 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:48.281612 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:48.298509 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:48.298443 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb" podStartSLOduration=1.298424281 podStartE2EDuration="1.298424281s" podCreationTimestamp="2026-04-22 18:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:48.295704505 +0000 UTC m=+2084.277350454" watchObservedRunningTime="2026-04-22 18:08:48.298424281 +0000 UTC m=+2084.280070234"
Apr 22 18:08:48.995987 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:48.995953 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5n8fn_21fe7e60-7b3d-470d-8f9f-63da9bbccfa6/dns/0.log"
Apr 22 18:08:49.015966 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:49.015924 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5n8fn_21fe7e60-7b3d-470d-8f9f-63da9bbccfa6/kube-rbac-proxy/0.log"
Apr 22 18:08:49.171336 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:49.171302 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-snnsr_4c36ca16-a7c9-4be1-a04f-3fe4cbe924fc/dns-node-resolver/0.log"
Apr 22 18:08:49.627421 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:49.627395 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-c6mcn_413cd8ee-53a3-4bf3-9251-11b54262bb83/node-ca/0.log"
Apr 22 18:08:50.443385 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:50.443354 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vtxh4_af5eb2e4-1b98-43d1-98e5-57c92f8736ee/discovery/0.log"
Apr 22 18:08:50.974762 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:50.974732 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dp46v_3abfaf4a-ddaa-41da-ad6c-d710aa0ba979/serve-healthcheck-canary/0.log"
Apr 22 18:08:51.522571 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:51.522538 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w688w_39adb942-f356-4943-b0ff-5b516ca93173/kube-rbac-proxy/0.log"
Apr 22 18:08:51.542749 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:51.542719 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w688w_39adb942-f356-4943-b0ff-5b516ca93173/exporter/0.log"
Apr 22 18:08:51.561932 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:51.561902 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w688w_39adb942-f356-4943-b0ff-5b516ca93173/extractor/0.log"
Apr 22 18:08:54.294782 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:54.294754 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h8t2w/perf-node-gather-daemonset-8qkcb"
Apr 22 18:08:54.683559 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:54.683509 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-869c67c9d6-pd2z7_251aa3df-ddac-4db4-ab20-cfcee419d68a/manager/0.log"
Apr 22 18:08:54.894542 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:54.894509 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-8g9bd_ef2d974f-7c57-4dd1-8c0c-45d19af7904c/manager/0.log"
Apr 22 18:08:54.938921 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:08:54.938820 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-tg77x_d8e023c5-7fd2-487c-a12d-6fb177d643f6/seaweedfs/0.log"
Apr 22 18:09:00.916332 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:00.916301 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:09:00.937118 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:00.937084 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/egress-router-binary-copy/0.log"
Apr 22 18:09:00.958118 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:00.958073 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/cni-plugins/0.log"
Apr 22 18:09:00.978041 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:00.978012 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/bond-cni-plugin/0.log"
Apr 22 18:09:01.002983 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:01.002956 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/routeoverride-cni/0.log"
Apr 22 18:09:01.026484 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:01.026455 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/whereabouts-cni-bincopy/0.log"
Apr 22 18:09:01.053319 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:01.053287 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9nhgg_51c5cdbf-e98b-4e0a-bdb5-c1cdbe7f1a81/whereabouts-cni/0.log"
Apr 22 18:09:01.260218 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:01.260136 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jckwf_edd47fac-d678-4368-a928-a3ac85b7a40a/kube-multus/0.log"
Apr 22 18:09:01.350933 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:01.350903 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cb4rf_71cf0a1f-9d8d-4195-9355-be900422df45/network-metrics-daemon/0.log"
Apr 22 18:09:01.368746 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:01.368714 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cb4rf_71cf0a1f-9d8d-4195-9355-be900422df45/kube-rbac-proxy/0.log"
Apr 22 18:09:02.842499 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.842464 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-controller/0.log"
Apr 22 18:09:02.858820 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.858787 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 18:09:02.867561 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.867534 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/1.log"
Apr 22 18:09:02.886560 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.886523 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/kube-rbac-proxy-node/0.log"
Apr 22 18:09:02.905964 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.905918 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:09:02.922773 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.922751 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/northd/0.log"
Apr 22 18:09:02.947282 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.947256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/nbdb/0.log"
Apr 22 18:09:02.967307 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:02.967275 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/sbdb/0.log"
Apr 22 18:09:03.063426 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:03.063393 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovnkube-controller/0.log"
Apr 22 18:09:04.148103 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:04.148069 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6wq9d_bca926f3-b1c0-45e1-9eb2-e0d0fc51b178/network-check-target-container/0.log"
Apr 22 18:09:04.624511 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:04.624485 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"
Apr 22 18:09:04.628343 ip-10-0-133-169 kubenswrapper[2568]: I0422 18:09:04.628324 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vckbs_1ffbd3d2-a676-442d-8a27-08dac0cc37fe/ovn-acl-logging/0.log"