Apr 16 18:02:15.014809 ip-10-0-128-209 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:02:15.496505 ip-10-0-128-209 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:15.496505 ip-10-0-128-209 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:02:15.496505 ip-10-0-128-209 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:15.496505 ip-10-0-128-209 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:02:15.496505 ip-10-0-128-209 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:15.497606 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.497445 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:02:15.502112 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502089 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:15.502112 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502108 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:15.502112 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502112 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:15.502112 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502115 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:15.502112 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502119 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502122 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502125 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502127 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502130 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502134 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
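The five deprecation warnings above all point at the same fix: move the flag values into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump later in this log), and replace --minimum-container-ttl-duration with evictionHard/evictionSoft settings. Below is a minimal sketch of generating the equivalent stanza with the upstream k8s.io/kubelet/config/v1beta1 types; the values are copied from this log's FLAG dump, and the real MachineConfig-managed file on this node may differ.

```go
// Sketch only: emits a KubeletConfiguration stanza covering the deprecated
// flags warned about above. Values are taken from the flags.go:64 dump in
// this log, not from the node's actual /etc/kubernetes/kubelet.conf.
package main

import (
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// Replaces the deprecated --container-runtime-endpoint flag.
		ContainerRuntimeEndpoint: "/var/run/crio/crio.sock",
		// Replaces the deprecated --volume-plugin-dir flag.
		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
		// Replaces the deprecated --system-reserved flag.
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"ephemeral-storage": "1Gi",
			"memory":            "1Gi",
		},
		// --minimum-container-ttl-duration would be replaced by
		// EvictionHard/EvictionSoft; --eviction-hard is empty in this log.
	}
	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}
```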
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502139 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502142 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502144 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502148 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502151 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502153 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502156 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502160 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502163 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502166 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502168 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502171 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502174 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:15.502325 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502176 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502179 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502181 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502184 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502186 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502189 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502201 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502204 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502207 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502209 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502212 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502215 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502218 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502220 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502223 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502226 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502228 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502231 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502234 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502237 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:15.502785 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502239 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502241 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502244 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502247 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502249 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502252 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502255 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502257 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502260 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502262 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502265 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502267 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502269 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502272 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502274 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502277 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502279 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502281 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502284 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502286 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:15.503255 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502289 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502292 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502294 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502298 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502300 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502303 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502306 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502309 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502311 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502314 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502316 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502319 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502322 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502324 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502327 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502331 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502334 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502337 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502339 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:15.503743 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502343 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502347 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502349 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502352 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502733 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502739 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502742 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502744 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502747 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502750 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502753 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502755 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502758 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502761 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502763 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502766 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502769 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502771 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502774 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502777 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:15.504185 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502780 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502783 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502785 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502788 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502791 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502793 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502796 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502800 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502804 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502807 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502810 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502813 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502815 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502818 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502821 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502824 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502826 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502829 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502831 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:15.504673 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502834 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502836 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502839 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502842 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502844 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502847 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502849 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502852 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502854 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502857 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502859 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502862 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502865 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502867 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502886 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502889 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502892 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502895 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502898 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502901 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:15.505124 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502904 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502907 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502910 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502913 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502915 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502918 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502920 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502924 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502927 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502930 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502933 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502935 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502938 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502940 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502943 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502946 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502948 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502951 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502953 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502956 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:15.505620 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502958 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502960 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502963 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502965 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502968 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502972 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502974 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502977 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502979 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502981 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.502984 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503065 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503072 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503079 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503083 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503088 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503091 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503095 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503100 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503103 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503106 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:02:15.506106 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503109 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503112 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503115 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503118 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503121 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503124 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503127 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503129 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503132 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503136 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503139 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503142 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503144 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503147 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503152 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503155 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503158 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503161 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503164 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503166 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503169 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503172 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503175 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503178 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503181 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:02:15.506624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503184 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503187 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503191 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503194 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503198 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503201 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503204 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503206 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503209 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503213 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503216 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503219 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503222 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503225 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503228 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503231 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503233 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503236 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503239 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503242 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503246 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503249 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503254 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503257 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503261 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:02:15.507195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503264 2572 flags.go:64] FLAG: --help="false"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503267 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503270 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503273 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503276 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503279 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503282 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503285 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503288 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503290 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503294 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503297 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503299 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503302 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503305 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503308 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503311 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503314 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503317 2572 flags.go:64] FLAG: --lock-file=""
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503320 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503323 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503325 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503330 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:02:15.507935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503333 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503336 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503339 2572 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503341 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503344 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503347 2572 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503352 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503356 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503359 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503363 2572 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503366 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503369 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503372 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503375 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503377 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503380 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503383 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503390 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503393 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503395 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503401 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503404 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503409 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503412 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:02:15.508818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503415 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503417 2572 flags.go:64] FLAG: --port="10250"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503421 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503423 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01c501b32f4b95057"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503426 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503429 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503432 2572 flags.go:64] FLAG: --register-node="true"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503435 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503437 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503441 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503443 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503446 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503449 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503452 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503456 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503459 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503462 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503465 2572 flags.go:64] FLAG: --runonce="false"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503467 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503470 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503473 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503476 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503478 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503482 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503485 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503487 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:02:15.509895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503490 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503493 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503495 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503499 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503501 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503504 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503507 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503525 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503531 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503534 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503537 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503540 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503543 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503545 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503548 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503551 2572 flags.go:64] FLAG: --v="2"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503555 2572 flags.go:64] FLAG: --version="false"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503559 2572 flags.go:64] FLAG: --vmodule=""
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503563 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.503566 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503658 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503662 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503665 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503667 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:15.510680 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503670 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503673 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503676 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503680 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503683 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503686 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503689 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503692 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503694 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503697 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:15.511238 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.503700 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
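The flags.go:64 "FLAG: --name=value" lines above are the kubelet logging every parsed command-line flag at startup (visible here because --v="2"); in Kubernetes components this is done by component-base's cli/flag PrintFlags over pflag. A rough stdlib-only sketch of the same pattern, assuming nothing beyond Go's flag package:

```go
// Sketch of the startup flag dump, not the kubelet's actual code.
package main

import (
	"flag"
	"log"
)

func main() {
	// A couple of stand-in flags; the kubelet defines hundreds.
	flag.Int("v", 2, "log verbosity")
	flag.String("node-ip", "0.0.0.0", "node IP")
	flag.Parse()

	// Visit every defined flag and log it in the same
	// FLAG: --name="value" shape seen in the journal above.
	flag.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})
}
```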
Apr 16 18:02:15.513176 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.504488 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:15.513176 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.511920 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:02:15.513176 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.511935 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:02:15.513606 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:15.512025 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
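The feature_gate.go:328 warnings earlier in this log come from OpenShift-specific gate names being passed to the kubelet's upstream gate parser, which recognizes only the Kubernetes gates that end up in the "feature gates:" summary above; unknown names are warned about and skipped rather than treated as fatal. A minimal Go sketch of that parse-and-warn pattern (illustrative names only, not the kubelet's actual code):

package main

import "fmt"

// Gates compiled into this binary; any other requested name is "unrecognized".
var known = map[string]bool{
	"NodeSwap":    false,
	"ImageVolume": true,
	"KMSv1":       true, // deprecated: still settable, but with a warning
}

// setFromMap mirrors the warn-and-skip behavior seen in the log.
func setFromMap(requested map[string]bool) {
	for name, enabled := range requested {
		if _, ok := known[name]; !ok {
			fmt.Printf("W feature_gate: unrecognized feature gate: %s\n", name)
			continue
		}
		known[name] = enabled
	}
	fmt.Printf("I feature_gate: feature gates: %v\n", known)
}

func main() {
	// OpenShift-only names such as RouteAdvertisements are unknown to the
	// kubelet's parser and produce one warning each per configuration pass.
	setFromMap(map[string]bool{"RouteAdvertisements": true, "NodeSwap": false})
}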
Apr 16 18:02:15.517189 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.512672 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:02:15.517566 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.515420 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
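Client rotation means the kubelet bootstraps with short-lived credentials, files a CSR, and then renews its client certificate in the background (the csr-6qsg8 approval and issuance appear further down). One way to check the rotated certificate on a node is to parse the current PEM with Go's standard library; the path below is the kubelet's conventional location for it, an assumption rather than something this log states:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
)

func main() {
	// Conventional location of the kubelet's rotated client cert (assumption).
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data) // first PEM block is the leaf certificate
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	fmt.Printf("subject=%s notAfter=%s\n", cert.Subject, cert.NotAfter)
}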
Apr 16 18:02:15.517566 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.516437 2572 server.go:1019] "Starting client certificate rotation"
Apr 16 18:02:15.517566 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.516536 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:15.517566 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.516575 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:02:15.545028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.545010 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:15.547744 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.547717 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:02:15.565228 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.565205 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:02:15.571200 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.571184 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 18:02:15.571735 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.571720 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:15.573063 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.573050 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:02:15.578063 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.578043 2572 fs.go:135] Filesystem UUIDs: map[0aca6b75-abf1-493b-bad4-756d39b74e40:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e44b5be7-c6f6-4d1a-9cb5-3cc53ddbf632:/dev/nvme0n1p3]
Apr 16 18:02:15.578141 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.578064 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:02:15.583508 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.582922 2572 manager.go:217] Machine: {Timestamp:2026-04-16 18:02:15.581566099 +0000 UTC m=+0.440283407 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3157373 MemoryCapacity:33164476416 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bfd37857b35bfd003b760c9c55786 SystemUUID:ec2bfd37-857b-35bf-d003-b760c9c55786 BootID:d9210dfe-9c89-4917-8f82-d2f6a8fdf119 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582238208 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d6:7b:5d:fd:5f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d6:7b:5d:fd:5f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:16:8f:c9:01:2c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164476416 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:02:15.583508 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.583497 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:02:15.583675 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.583608 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:02:15.584857 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.584829 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:02:15.585020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.584859 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-209.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:02:15.585094 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.585033 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:02:15.585094 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.585045 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:02:15.585094 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.585062 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:02:15.585094 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.585082 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:02:15.586646 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.586633 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:02:15.586799 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.586788 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:02:15.589338 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.589326 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:02:15.589392 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.589350 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:02:15.590297 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.590286 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:02:15.590346 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.590304 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:02:15.590346 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.590317 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:02:15.591547 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.591534 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:02:15.591617 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.591557 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:02:15.594709 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.594696 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:02:15.596581 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.596565 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:02:15.596978 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.596962 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6qsg8"
Apr 16 18:02:15.597994 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.597980 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598001 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598010 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598023 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598032 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598041 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598049 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598057 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:02:15.598068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598066 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:02:15.598308 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598075 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:02:15.598308 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598094 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:02:15.598308 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.598108 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:02:15.599920 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.599909 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:02:15.599978 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.599924 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:02:15.602763 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.602744 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-209.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:02:15.602875 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.602858 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-209.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:02:15.602979 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.602965 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:02:15.603239 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.603226 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:02:15.603287 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.603267 2572 server.go:1295] "Started kubelet"
Apr 16 18:02:15.603355 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.603331 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:02:15.603395 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.603350 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6qsg8"
Apr 16 18:02:15.604018 ip-10-0-128-209 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:02:15.604879 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.604592 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:02:15.604879 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.604677 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:02:15.605717 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.605696 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:02:15.608157 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.608143 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:02:15.611373 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.611359 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:02:15.611450 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.611372 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:02:15.612060 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.612048 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:02:15.612170 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.612153 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:02:15.612252 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.612059 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:02:15.612424 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.612402 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:02:15.612474 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.612425 2572 factory.go:55] Registering systemd factory
Apr 16 18:02:15.612474 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.612435 2572 factory.go:223] Registration of the systemd container factory successfully
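The podresources listener that just came up serves a gRPC API on unix:/var/lib/kubelet/pod-resources/kubelet.sock, with the rate limit logged above (qps=100, burst 10) applied to it. A minimal client sketch using the published v1 stubs, assuming the k8s.io/kubelet and google.golang.org/grpc modules are on the module path:

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	podresourcesv1 "k8s.io/kubelet/pkg/apis/podresources/v1"
)

func main() {
	// Socket path as logged by server.go:255; access requires root on the node.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/pod-resources/kubelet.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// List returns the resources (devices, CPUs) assigned to each pod.
	client := podresourcesv1.NewPodResourcesListerClient(conn)
	resp, err := client.List(ctx, &podresourcesv1.ListPodResourcesRequest{})
	if err != nil {
		panic(err)
	}
	for _, pod := range resp.GetPodResources() {
		fmt.Printf("%s/%s containers=%d\n",
			pod.GetNamespace(), pod.GetName(), len(pod.GetContainers()))
	}
}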
Apr 16 18:02:15.613573 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.613548 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:15.613661 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.613638 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:02:15.613661 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.613652 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:02:15.613987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.613926 2572 factory.go:153] Registering CRI-O factory
Apr 16 18:02:15.613987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.613940 2572 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:02:15.613987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.613963 2572 factory.go:103] Registering Raw factory
Apr 16 18:02:15.613987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.613979 2572 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:02:15.614150 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.614119 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:15.614575 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.614559 2572 manager.go:319] Starting recovery of all containers
Apr 16 18:02:15.614746 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.614724 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:02:15.618840 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.618819 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-209.ec2.internal\" not found" node="ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.623842 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.623754 2572 manager.go:324] Recovery completed
Apr 16 18:02:15.624837 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.624821 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 18:02:15.627442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.627431 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:15.629669 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.629655 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:15.629742 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.629680 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:15.629742 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.629690 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:15.630088 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.630075 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:02:15.630088 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.630087 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:02:15.630165 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.630104 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:02:15.632209 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.632198 2572 policy_none.go:49] "None policy: Start"
Apr 16 18:02:15.632249 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.632213 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:02:15.632249 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.632222 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:02:15.669047 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669030 2572 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.669068 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669081 2572 server.go:85] "Starting device plugin registration server"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669302 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669314 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669861 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669923 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.669931 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.669982 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:02:15.678254 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.670016 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:15.737742 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.737695 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:02:15.738899 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.738874 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:02:15.738987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.738904 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:02:15.738987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.738926 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:02:15.738987 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.738932 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:02:15.738987 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.738971 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:02:15.742083 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.742068 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:15.770573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.770508 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:15.771347 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.771331 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:15.771409 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.771364 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:15.771409 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.771376 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:15.771409 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.771398 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.779634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.779619 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.779698 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.779639 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-209.ec2.internal\": node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:15.790432 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.790408 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:15.839992 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.839965 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"]
Apr 16 18:02:15.840075 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.840034 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:15.840868 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.840854 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:15.840938 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.840883 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:15.840938 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.840898 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:15.842195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.842183 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:15.842334 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.842321 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.842379 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.842347 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:15.843132 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.843116 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:15.843206 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.843140 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:15.843206 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.843153 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:15.843206 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.843174 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:15.843206 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.843184 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:15.843380 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.843155 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:15.844926 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.844911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"
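Note: the registration sequence above ("Attempting to register node", then "Successfully registered node", immediately followed by more "Error getting the current node from lister" errors) is the kubelet racing its own informer cache: the Node object already exists on the API server while the local lister has not yet seen it. A minimal client-go sketch that performs the same direct lookup; the kubeconfig path here is an assumption for illustration, not taken from this log:

    // Sketch: look up the node directly on the API server, bypassing the
    // informer cache whose staleness produces the errors above.
    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // assumed path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        node, err := cs.CoreV1().Nodes().Get(context.TODO(), "ip-10-0-128-209.ec2.internal", metav1.GetOptions{})
        switch {
        case apierrors.IsNotFound(err):
            fmt.Println("node object not created yet (matches the lister errors above)")
        case err != nil:
            panic(err)
        default:
            fmt.Println("node registered:", node.Name)
        }
    }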
Apr 16 18:02:15.845000 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.844933 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:02:15.845539 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.845525 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:02:15.845634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.845548 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:02:15.845634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.845559 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:02:15.867153 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.867136 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-209.ec2.internal\" not found" node="ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.871375 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.871361 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-209.ec2.internal\" not found" node="ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.890809 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.890795 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:15.913983 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.913966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/34402c7e80e9e6cd1afa932f89941529-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal\" (UID: \"34402c7e80e9e6cd1afa932f89941529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.914078 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.913991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34402c7e80e9e6cd1afa932f89941529-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal\" (UID: \"34402c7e80e9e6cd1afa932f89941529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.914078 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:15.914009 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a9348010412cd69bb166b4f63c170f91-config\") pod \"kube-apiserver-proxy-ip-10-0-128-209.ec2.internal\" (UID: \"a9348010412cd69bb166b4f63c170f91\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:15.991318 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:15.991282 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:16.014681 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.014657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/34402c7e80e9e6cd1afa932f89941529-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal\" (UID: \"34402c7e80e9e6cd1afa932f89941529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.014749 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.014688 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34402c7e80e9e6cd1afa932f89941529-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal\" (UID: \"34402c7e80e9e6cd1afa932f89941529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.014749 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.014709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a9348010412cd69bb166b4f63c170f91-config\") pod \"kube-apiserver-proxy-ip-10-0-128-209.ec2.internal\" (UID: \"a9348010412cd69bb166b4f63c170f91\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.014749 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.014741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a9348010412cd69bb166b4f63c170f91-config\") pod \"kube-apiserver-proxy-ip-10-0-128-209.ec2.internal\" (UID: \"a9348010412cd69bb166b4f63c170f91\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.014840 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.014753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34402c7e80e9e6cd1afa932f89941529-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal\" (UID: \"34402c7e80e9e6cd1afa932f89941529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.014840 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.014742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/34402c7e80e9e6cd1afa932f89941529-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal\" (UID: \"34402c7e80e9e6cd1afa932f89941529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.091986 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.091914 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:16.170568 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.170538 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.174264 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.174248 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"
Apr 16 18:02:16.192278 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.192254 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:16.292778 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.292736 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:16.393286 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.393207 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:16.493706 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.493669 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-209.ec2.internal\" not found"
Apr 16 18:02:16.516042 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.516005 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:02:16.516494 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.516200 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:02:16.516494 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.516212 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:02:16.588437 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.588401 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:16.590606 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.590589 2572 apiserver.go:52] "Watching apiserver"
Apr 16 18:02:16.599880 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.599848 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:02:16.602173 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.602147 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-wdh27","openshift-dns/node-resolver-zdnrz","openshift-multus/multus-9hksp","openshift-multus/network-metrics-daemon-2k4qz","openshift-network-diagnostics/network-check-target-jp7mf","openshift-network-operator/iptables-alerter-f2fhk","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m","openshift-image-registry/node-ca-zbk52","openshift-multus/multus-additional-cni-plugins-vtf28","openshift-ovn-kubernetes/ovnkube-node-x27gf","kube-system/konnectivity-agent-zkhsv"]
Apr 16 18:02:16.604553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.604533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.604639 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.604610 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zdnrz"
Apr 16 18:02:16.606634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.606607 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:57:15 +0000 UTC" deadline="2027-10-16 19:54:31.937126769 +0000 UTC"
Apr 16 18:02:16.606634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.606632 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13153h52m15.330497381s"
Apr 16 18:02:16.606750 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.606686 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.607165 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.607150 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:02:16.607912 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.607497 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:02:16.607912 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.607642 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zxmdm\""
Apr 16 18:02:16.607912 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.607698 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rtd5c\""
Apr 16 18:02:16.607912 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.607785 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:02:16.608129 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.607919 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:02:16.608191 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.608176 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf"
Apr 16 18:02:16.608236 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.608212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz"
Apr 16 18:02:16.608361 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.608331 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c"
Apr 16 18:02:16.608401 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.608357 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5"
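Note: the "Error syncing pod, skipping" entries above repeat until a CNI plugin writes its configuration into the directory the error names. A small standalone sketch that checks that directory for config files; this is purely illustrative of the failing condition, not how CRI-O itself performs the check:

    // Sketch: look for CNI network configs in the directory from the error
    // above. No .conf/.conflist file means the network plugin is not ready.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // directory named in the error above
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            switch strings.ToLower(filepath.Ext(e.Name())) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config present:", e.Name())
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file — network plugin not ready yet")
        }
    }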
pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:16.609434 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.609412 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9f7s6\"" Apr 16 18:02:16.609544 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.609442 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:02:16.609712 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.609693 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:02:16.609832 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.609734 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:02:16.609957 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.609938 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:02:16.610866 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.610840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.611449 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.611435 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:16.611807 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.611792 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal" Apr 16 18:02:16.612216 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.612205 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.613269 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.613256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.613375 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.613354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.613953 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.613939 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:02:16.614148 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.614135 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:16.614576 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.614564 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.614576 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.614571 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pcbgg\"" Apr 16 18:02:16.615764 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.615735 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.616704 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.616685 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:02:16.616796 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.616737 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:16.616796 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.616750 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:02:16.617253 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-conf-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.617292 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/045160d0-0fd3-47d2-90ec-0bb2af115ef2-hosts-file\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.617333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-modprobe-d\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.617333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysctl-conf\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.617423 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-netns\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.617423 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617362 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:02:16.617423 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617384 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:02:16.617423 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qktfc\"" Apr 16 18:02:16.617423 ip-10-0-128-209 
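Note: the long run of reflector "Caches populated" lines above are per-object shared informers completing their initial LIST against the API server before the kubelet trusts its local cache. A minimal client-go equivalent of starting an informer and waiting for that sync; the in-cluster credentials here are an assumption of the sketch, not something taken from this log:

    // Sketch: start a shared informer and block until its cache is populated,
    // the same milestone the reflector lines above are reporting.
    package main

    import (
        "fmt"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
        "k8s.io/client-go/tools/cache"
    )

    func main() {
        cfg, err := rest.InClusterConfig() // assumed: running inside the cluster
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        factory := informers.NewSharedInformerFactory(cs, 0)
        cmInformer := factory.Core().V1().ConfigMaps().Informer()

        stop := make(chan struct{})
        defer close(stop)
        factory.Start(stop)
        if ok := cache.WaitForCacheSync(stop, cmInformer.HasSynced); !ok {
            panic("cache failed to sync")
        }
        fmt.Println("Caches populated") // analogous to the reflector lines above
    }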
Apr 16 18:02:16.617423 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617388 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617432 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f601d51e-6912-402c-abe7-76ac16678f2a-host\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f601d51e-6912-402c-abe7-76ac16678f2a-serviceca\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhtzk\" (UniqueName: \"kubernetes.io/projected/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-kube-api-access-mhtzk\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-run\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617571 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblfr\" (UniqueName: \"kubernetes.io/projected/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-kube-api-access-lblfr\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-daemon-config\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-cni-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-k8s-cni-cncf-io\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.617692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-cni-multus\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-registration-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-systemd\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-sys\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8spx\" (UniqueName: \"kubernetes.io/projected/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-kube-api-access-t8spx\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617776 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-var-lib-kubelet\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-multus-certs\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-kubernetes\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-os-release\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-host-slash\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-os-release\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-lib-modules\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617943 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5943e99-4c81-4af3-a008-f184fe0a2d79-cni-binary-copy\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-hostroot\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.618028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-socket-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m"
Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.617993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-tuned\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-tmp\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27"
Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-etc-kubernetes\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp"
Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618053 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlvb\" (UniqueName: \"kubernetes.io/projected/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-kube-api-access-hxlvb\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz"
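Note: the many kube-api-access-* volumes being verified above are the projected service-account token volumes Kubernetes injects into every pod. A sketch of their shape with the k8s.io/api types; the expiration value is the usual injected default and is an assumption here, as are the projection details, which this log does not show:

    // Sketch: a projected volume combining a bound service-account token and
    // the cluster CA bundle, the pattern behind the kube-api-access-* names.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/utils/ptr"
    )

    func main() {
        vol := corev1.Volume{
            Name: "kube-api-access-hxlvb",
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path:              "token",
                            ExpirationSeconds: ptr.To[int64](3607), // assumed default
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                    },
                },
            },
        }
        fmt.Printf("projected volume %q with %d sources\n", vol.Name, len(vol.VolumeSource.Projected.Sources))
    }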
\"kubernetes.io/projected/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-kube-api-access-hxlvb\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-iptables-alerter-script\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4ng\" (UniqueName: \"kubernetes.io/projected/f781e5cc-b111-4034-8a85-cae2e3e72a72-kube-api-access-lq4ng\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/045160d0-0fd3-47d2-90ec-0bb2af115ef2-tmp-dir\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-sys-fs\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwpl\" (UniqueName: \"kubernetes.io/projected/f601d51e-6912-402c-abe7-76ac16678f2a-kube-api-access-zdwpl\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618191 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618202 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-system-cni-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-cnibin\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618232 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysctl-d\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-socket-dir-parent\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-cni-bin\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.619108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618289 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6w628\"" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcl5l\" (UniqueName: \"kubernetes.io/projected/d5943e99-4c81-4af3-a008-f184fe0a2d79-kube-api-access-qcl5l\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-etc-selinux\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-system-cni-dir\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vtf28\" (UID: 
\"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysconfig\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-host\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-cnibin\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-kubelet\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2xn\" (UniqueName: \"kubernetes.io/projected/045160d0-0fd3-47d2-90ec-0bb2af115ef2-kube-api-access-gt2xn\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.619934 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.618806 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pplkx\"" Apr 16 18:02:16.622177 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:02:16.622356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622331 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:02:16.622456 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622333 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:02:16.622456 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:02:16.622636 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:02:16.622731 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622685 2572 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:02:16.622940 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622920 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c8lcb\"" Apr 16 18:02:16.623036 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.622995 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d8fxd\"" Apr 16 18:02:16.623036 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.623004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:02:16.623682 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.623184 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:02:16.623682 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.623677 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:02:16.634348 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.634331 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal"] Apr 16 18:02:16.635651 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.635639 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:16.635710 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.635697 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal" Apr 16 18:02:16.642935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.642917 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:02:16.670164 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.670137 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:16.670348 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.670333 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal"] Apr 16 18:02:16.674156 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.674143 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6ml67" Apr 16 18:02:16.683717 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.683702 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6ml67" Apr 16 18:02:16.713604 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.713583 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-sys\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8spx\" (UniqueName: \"kubernetes.io/projected/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-kube-api-access-t8spx\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-var-lib-kubelet\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-multus-certs\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovn-node-metrics-cert\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-kubernetes\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-os-release\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.719783 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-host-slash\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-systemd\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-os-release\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-lib-modules\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5943e99-4c81-4af3-a008-f184fe0a2d79-cni-binary-copy\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-hostroot\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-socket-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b8b228d4-bea7-4887-8dd2-672c2f8c5e45-konnectivity-ca\") pod \"konnectivity-agent-zkhsv\" (UID: \"b8b228d4-bea7-4887-8dd2-672c2f8c5e45\") " pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-tuned\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.719996 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-tmp\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-etc-kubernetes\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720028 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlvb\" (UniqueName: \"kubernetes.io/projected/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-kube-api-access-hxlvb\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-iptables-alerter-script\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-host-slash\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4ng\" (UniqueName: \"kubernetes.io/projected/f781e5cc-b111-4034-8a85-cae2e3e72a72-kube-api-access-lq4ng\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/045160d0-0fd3-47d2-90ec-0bb2af115ef2-tmp-dir\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-multus-certs\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.720259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-sys-fs\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-var-lib-kubelet\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwpl\" (UniqueName: \"kubernetes.io/projected/f601d51e-6912-402c-abe7-76ac16678f2a-kube-api-access-zdwpl\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-socket-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-var-lib-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-lib-modules\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-systemd-units\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-run-netns\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-os-release\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-os-release\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-etc-kubernetes\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720481 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-etc-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-kubernetes\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720556 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-node-log\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-sys-fs\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720638 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-cni-bin\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovnkube-script-lib\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-system-cni-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-cnibin\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/045160d0-0fd3-47d2-90ec-0bb2af115ef2-tmp-dir\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5943e99-4c81-4af3-a008-f184fe0a2d79-cni-binary-copy\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-hostroot\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-sys\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysctl-d\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-socket-dir-parent\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-cni-bin\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-iptables-alerter-script\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.720980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcl5l\" (UniqueName: \"kubernetes.io/projected/d5943e99-4c81-4af3-a008-f184fe0a2d79-kube-api-access-qcl5l\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-etc-selinux\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721063 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-system-cni-dir\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.721818 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-socket-dir-parent\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysconfig\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysctl-d\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-host\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-cni-bin\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-system-cni-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-host\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721212 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-system-cni-dir\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysconfig\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-cnibin\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-cnibin\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-kubelet\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-cnibin\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2xn\" (UniqueName: \"kubernetes.io/projected/045160d0-0fd3-47d2-90ec-0bb2af115ef2-kube-api-access-gt2xn\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-kubelet\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.722562 ip-10-0-128-209 kubenswrapper[2572]: I0416 
18:02:16.721583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-etc-selinux\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721599 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-ovn\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-log-socket\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-cni-netd\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-conf-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/045160d0-0fd3-47d2-90ec-0bb2af115ef2-hosts-file\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-kubelet\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-modprobe-d\") pod \"tuned-wdh27\" (UID: 
\"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-conf-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85jsb\" (UniqueName: \"kubernetes.io/projected/533bfb3b-fb81-47d8-a968-aa3baab674a7-kube-api-access-85jsb\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-modprobe-d\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/045160d0-0fd3-47d2-90ec-0bb2af115ef2-hosts-file\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysctl-conf\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-netns\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-device-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-netns\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.723723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.721998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-sysctl-conf\") pod 
\"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f601d51e-6912-402c-abe7-76ac16678f2a-host\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722021 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-device-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f601d51e-6912-402c-abe7-76ac16678f2a-serviceca\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhtzk\" (UniqueName: \"kubernetes.io/projected/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-kube-api-access-mhtzk\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722116 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f601d51e-6912-402c-abe7-76ac16678f2a-host\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-run\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lblfr\" (UniqueName: \"kubernetes.io/projected/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-kube-api-access-lblfr\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722270 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-run\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-daemon-config\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovnkube-config\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f601d51e-6912-402c-abe7-76ac16678f2a-serviceca\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.724304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722423 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-env-overrides\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.724847 ip-10-0-128-209 
kubenswrapper[2572]: I0416 18:02:16.722449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-cni-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722474 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-k8s-cni-cncf-io\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-cni-multus\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-registration-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b8b228d4-bea7-4887-8dd2-672c2f8c5e45-agent-certs\") pod \"konnectivity-agent-zkhsv\" (UID: \"b8b228d4-bea7-4887-8dd2-672c2f8c5e45\") " pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722592 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-slash\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-systemd\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-systemd\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.722777 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722827 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f781e5cc-b111-4034-8a85-cae2e3e72a72-cni-binary-copy\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-daemon-config\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.722855 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:17.222826424 +0000 UTC m=+2.081543722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-var-lib-cni-multus\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-registration-dir\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-host-run-k8s-cni-cncf-io\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.722994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5943e99-4c81-4af3-a008-f184fe0a2d79-multus-cni-dir\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.724847 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.723547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-etc-tuned\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.725336 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.723578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f781e5cc-b111-4034-8a85-cae2e3e72a72-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.725336 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.723919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-tmp\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.729345 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.729327 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:16.729345 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.729344 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:16.729528 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.729353 2572 projected.go:194] Error preparing data for projected volume kube-api-access-9cdps for pod openshift-network-diagnostics/network-check-target-jp7mf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:16.729528 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:16.729483 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps podName:164700ca-d6d4-4aee-86e6-4fca944bb4b5 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:17.229443227 +0000 UTC m=+2.088160526 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9cdps" (UniqueName: "kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps") pod "network-check-target-jp7mf" (UID: "164700ca-d6d4-4aee-86e6-4fca944bb4b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:16.730076 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.730057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8spx\" (UniqueName: \"kubernetes.io/projected/1b73a7d2-ac33-42f6-91e5-c24cbb5b4113-kube-api-access-t8spx\") pod \"aws-ebs-csi-driver-node-5ff9m\" (UID: \"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:16.731617 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.731597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwpl\" (UniqueName: \"kubernetes.io/projected/f601d51e-6912-402c-abe7-76ac16678f2a-kube-api-access-zdwpl\") pod \"node-ca-zbk52\" (UID: \"f601d51e-6912-402c-abe7-76ac16678f2a\") " pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:16.732114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.732095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4ng\" (UniqueName: \"kubernetes.io/projected/f781e5cc-b111-4034-8a85-cae2e3e72a72-kube-api-access-lq4ng\") pod \"multus-additional-cni-plugins-vtf28\" (UID: \"f781e5cc-b111-4034-8a85-cae2e3e72a72\") " pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.734801 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.734773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblfr\" (UniqueName: \"kubernetes.io/projected/914e179f-5ddd-46e1-8fbc-fc98cc6389e5-kube-api-access-lblfr\") pod \"tuned-wdh27\" (UID: \"914e179f-5ddd-46e1-8fbc-fc98cc6389e5\") " pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.735073 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.735054 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlvb\" (UniqueName: \"kubernetes.io/projected/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-kube-api-access-hxlvb\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:16.735342 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.735320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2xn\" (UniqueName: \"kubernetes.io/projected/045160d0-0fd3-47d2-90ec-0bb2af115ef2-kube-api-access-gt2xn\") pod \"node-resolver-zdnrz\" (UID: \"045160d0-0fd3-47d2-90ec-0bb2af115ef2\") " pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.735629 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.735602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhtzk\" (UniqueName: \"kubernetes.io/projected/40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd-kube-api-access-mhtzk\") pod \"iptables-alerter-f2fhk\" (UID: \"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd\") " pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.735872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.735854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcl5l\" (UniqueName: 
\"kubernetes.io/projected/d5943e99-4c81-4af3-a008-f184fe0a2d79-kube-api-access-qcl5l\") pod \"multus-9hksp\" (UID: \"d5943e99-4c81-4af3-a008-f184fe0a2d79\") " pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.744207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.744189 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vtf28" Apr 16 18:02:16.768485 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.768451 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34402c7e80e9e6cd1afa932f89941529.slice/crio-4a97a4fc92ee96dde8b8ab0f3b11ae2c445e62e0e9860a595044cd81d1b09d63 WatchSource:0}: Error finding container 4a97a4fc92ee96dde8b8ab0f3b11ae2c445e62e0e9860a595044cd81d1b09d63: Status 404 returned error can't find the container with id 4a97a4fc92ee96dde8b8ab0f3b11ae2c445e62e0e9860a595044cd81d1b09d63 Apr 16 18:02:16.768877 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.768852 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf781e5cc_b111_4034_8a85_cae2e3e72a72.slice/crio-fa2895f47655274f5cf36bef8795d29e0e65e0141470a6a61d3c77f63f76537b WatchSource:0}: Error finding container fa2895f47655274f5cf36bef8795d29e0e65e0141470a6a61d3c77f63f76537b: Status 404 returned error can't find the container with id fa2895f47655274f5cf36bef8795d29e0e65e0141470a6a61d3c77f63f76537b Apr 16 18:02:16.773659 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.773646 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:02:16.787317 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.787298 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9348010412cd69bb166b4f63c170f91.slice/crio-abd5a4d7a4f83c1b97213b60e5490a68005206c18803b0dda44a80d990c84e98 WatchSource:0}: Error finding container abd5a4d7a4f83c1b97213b60e5490a68005206c18803b0dda44a80d990c84e98: Status 404 returned error can't find the container with id abd5a4d7a4f83c1b97213b60e5490a68005206c18803b0dda44a80d990c84e98 Apr 16 18:02:16.823010 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.822984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-var-lib-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-systemd-units\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-run-netns\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 
18:02:16.823061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-etc-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-var-lib-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823085 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-systemd-units\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-node-log\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823140 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-cni-bin\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-node-log\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823101 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-run-netns\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 
kubenswrapper[2572]: I0416 18:02:16.823158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovnkube-script-lib\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823115 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-etc-openvswitch\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-cni-bin\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-ovn\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-log-socket\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-cni-netd\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-ovn\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-kubelet\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85jsb\" (UniqueName: \"kubernetes.io/projected/533bfb3b-fb81-47d8-a968-aa3baab674a7-kube-api-access-85jsb\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 
18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823386 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-log-socket\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-kubelet\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-cni-netd\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.823500 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovnkube-config\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-env-overrides\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b8b228d4-bea7-4887-8dd2-672c2f8c5e45-agent-certs\") pod \"konnectivity-agent-zkhsv\" (UID: \"b8b228d4-bea7-4887-8dd2-672c2f8c5e45\") " pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-slash\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovn-node-metrics-cert\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-systemd\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovnkube-script-lib\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-host-slash\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b8b228d4-bea7-4887-8dd2-672c2f8c5e45-konnectivity-ca\") pod \"konnectivity-agent-zkhsv\" (UID: \"b8b228d4-bea7-4887-8dd2-672c2f8c5e45\") " pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.823755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/533bfb3b-fb81-47d8-a968-aa3baab674a7-run-systemd\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824207 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.824040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovnkube-config\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824687 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.824663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/533bfb3b-fb81-47d8-a968-aa3baab674a7-env-overrides\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.824789 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.824768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b8b228d4-bea7-4887-8dd2-672c2f8c5e45-konnectivity-ca\") pod \"konnectivity-agent-zkhsv\" (UID: \"b8b228d4-bea7-4887-8dd2-672c2f8c5e45\") " pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.825792 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.825776 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/533bfb3b-fb81-47d8-a968-aa3baab674a7-ovn-node-metrics-cert\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.825865 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.825819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b8b228d4-bea7-4887-8dd2-672c2f8c5e45-agent-certs\") pod \"konnectivity-agent-zkhsv\" (UID: \"b8b228d4-bea7-4887-8dd2-672c2f8c5e45\") " pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:16.831678 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.831663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85jsb\" (UniqueName: \"kubernetes.io/projected/533bfb3b-fb81-47d8-a968-aa3baab674a7-kube-api-access-85jsb\") pod \"ovnkube-node-x27gf\" (UID: \"533bfb3b-fb81-47d8-a968-aa3baab674a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:16.871122 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.871100 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:16.927803 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.927731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wdh27" Apr 16 18:02:16.933726 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.933700 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914e179f_5ddd_46e1_8fbc_fc98cc6389e5.slice/crio-a26e79bc140b9d4164ac61717430d001097e958cf84d5710779f3f081be7c5bd WatchSource:0}: Error finding container a26e79bc140b9d4164ac61717430d001097e958cf84d5710779f3f081be7c5bd: Status 404 returned error can't find the container with id a26e79bc140b9d4164ac61717430d001097e958cf84d5710779f3f081be7c5bd Apr 16 18:02:16.941894 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.941875 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zdnrz" Apr 16 18:02:16.948535 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.948494 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045160d0_0fd3_47d2_90ec_0bb2af115ef2.slice/crio-39ff44c0bf054132461d4d8f209c733cc05f6f345465ebf8040e623ec04d3bcb WatchSource:0}: Error finding container 39ff44c0bf054132461d4d8f209c733cc05f6f345465ebf8040e623ec04d3bcb: Status 404 returned error can't find the container with id 39ff44c0bf054132461d4d8f209c733cc05f6f345465ebf8040e623ec04d3bcb Apr 16 18:02:16.949672 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.949641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9hksp" Apr 16 18:02:16.955429 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.955408 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5943e99_4c81_4af3_a008_f184fe0a2d79.slice/crio-c7e3e46b1f236a9140cb5a0c8c95f4fa7f2b7f9384afbb62e067c229fceff312 WatchSource:0}: Error finding container c7e3e46b1f236a9140cb5a0c8c95f4fa7f2b7f9384afbb62e067c229fceff312: Status 404 returned error can't find the container with id c7e3e46b1f236a9140cb5a0c8c95f4fa7f2b7f9384afbb62e067c229fceff312 Apr 16 18:02:16.983179 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.983155 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f2fhk" Apr 16 18:02:16.989558 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:16.989538 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40ee05ef_cbcc_43c7_8d8e_d8c52630c3cd.slice/crio-dbd6a73b21f74578addbbe1528d5fef24ca6c6fde6504fe209e0c657fcafccd9 WatchSource:0}: Error finding container dbd6a73b21f74578addbbe1528d5fef24ca6c6fde6504fe209e0c657fcafccd9: Status 404 returned error can't find the container with id dbd6a73b21f74578addbbe1528d5fef24ca6c6fde6504fe209e0c657fcafccd9 Apr 16 18:02:16.998581 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:16.998562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" Apr 16 18:02:17.003840 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:17.003821 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b73a7d2_ac33_42f6_91e5_c24cbb5b4113.slice/crio-6562174b52d8edf84327cf378445b9a697a7d893c81382994ba634efdc748117 WatchSource:0}: Error finding container 6562174b52d8edf84327cf378445b9a697a7d893c81382994ba634efdc748117: Status 404 returned error can't find the container with id 6562174b52d8edf84327cf378445b9a697a7d893c81382994ba634efdc748117 Apr 16 18:02:17.014716 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.014700 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zbk52" Apr 16 18:02:17.061063 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.061035 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:17.065616 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.065592 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:17.067212 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:17.067188 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533bfb3b_fb81_47d8_a968_aa3baab674a7.slice/crio-68ceba9b90187f87d8923e404fede5aebef76512b658e16552b100d8a8954d65 WatchSource:0}: Error finding container 68ceba9b90187f87d8923e404fede5aebef76512b658e16552b100d8a8954d65: Status 404 returned error can't find the container with id 68ceba9b90187f87d8923e404fede5aebef76512b658e16552b100d8a8954d65 Apr 16 18:02:17.071680 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:17.071661 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b228d4_bea7_4887_8dd2_672c2f8c5e45.slice/crio-ffbcdd237a53d9085d2bc2a6e38bedc6541b4fefe065322614e496e3123ccfa0 WatchSource:0}: Error finding container ffbcdd237a53d9085d2bc2a6e38bedc6541b4fefe065322614e496e3123ccfa0: Status 404 returned error can't find the container with id ffbcdd237a53d9085d2bc2a6e38bedc6541b4fefe065322614e496e3123ccfa0 Apr 16 18:02:17.227973 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.227882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:17.228123 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.228051 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:17.228190 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.228124 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:18.228103433 +0000 UTC m=+3.086820744 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:17.328324 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.328287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:17.328499 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.328458 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:17.328499 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.328480 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:17.328499 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.328494 2572 projected.go:194] Error preparing data for projected volume kube-api-access-9cdps for pod openshift-network-diagnostics/network-check-target-jp7mf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:17.328667 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.328571 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps podName:164700ca-d6d4-4aee-86e6-4fca944bb4b5 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:18.328550504 +0000 UTC m=+3.187267803 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9cdps" (UniqueName: "kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps") pod "network-check-target-jp7mf" (UID: "164700ca-d6d4-4aee-86e6-4fca944bb4b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:17.563821 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.563505 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:17.631949 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.631919 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:17.685339 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.685294 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:16 +0000 UTC" deadline="2027-09-30 14:13:45.169862566 +0000 UTC" Apr 16 18:02:17.685339 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.685337 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12764h11m27.484529558s" Apr 16 18:02:17.742299 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.740434 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:17.742299 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:17.740567 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:17.764250 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.764034 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zkhsv" event={"ID":"b8b228d4-bea7-4887-8dd2-672c2f8c5e45","Type":"ContainerStarted","Data":"ffbcdd237a53d9085d2bc2a6e38bedc6541b4fefe065322614e496e3123ccfa0"} Apr 16 18:02:17.775804 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.775764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" event={"ID":"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113","Type":"ContainerStarted","Data":"6562174b52d8edf84327cf378445b9a697a7d893c81382994ba634efdc748117"} Apr 16 18:02:17.781606 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.781570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f2fhk" event={"ID":"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd","Type":"ContainerStarted","Data":"dbd6a73b21f74578addbbe1528d5fef24ca6c6fde6504fe209e0c657fcafccd9"} Apr 16 18:02:17.798956 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.797466 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hksp" event={"ID":"d5943e99-4c81-4af3-a008-f184fe0a2d79","Type":"ContainerStarted","Data":"c7e3e46b1f236a9140cb5a0c8c95f4fa7f2b7f9384afbb62e067c229fceff312"} Apr 16 18:02:17.812890 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.812769 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wdh27" event={"ID":"914e179f-5ddd-46e1-8fbc-fc98cc6389e5","Type":"ContainerStarted","Data":"a26e79bc140b9d4164ac61717430d001097e958cf84d5710779f3f081be7c5bd"} Apr 16 18:02:17.835732 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.835639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal" event={"ID":"a9348010412cd69bb166b4f63c170f91","Type":"ContainerStarted","Data":"abd5a4d7a4f83c1b97213b60e5490a68005206c18803b0dda44a80d990c84e98"} Apr 16 18:02:17.838838 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.838807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"68ceba9b90187f87d8923e404fede5aebef76512b658e16552b100d8a8954d65"} Apr 16 18:02:17.866005 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.865969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zbk52" event={"ID":"f601d51e-6912-402c-abe7-76ac16678f2a","Type":"ContainerStarted","Data":"66d8733bd6dfe3975b4c770182d2d1dcd66c3d455b3beffc03c8d670101ce12e"} Apr 16 18:02:17.891652 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.891564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zdnrz" 
event={"ID":"045160d0-0fd3-47d2-90ec-0bb2af115ef2","Type":"ContainerStarted","Data":"39ff44c0bf054132461d4d8f209c733cc05f6f345465ebf8040e623ec04d3bcb"} Apr 16 18:02:17.919972 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.919927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerStarted","Data":"fa2895f47655274f5cf36bef8795d29e0e65e0141470a6a61d3c77f63f76537b"} Apr 16 18:02:17.927224 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:17.927165 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal" event={"ID":"34402c7e80e9e6cd1afa932f89941529","Type":"ContainerStarted","Data":"4a97a4fc92ee96dde8b8ab0f3b11ae2c445e62e0e9860a595044cd81d1b09d63"} Apr 16 18:02:18.236933 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:18.236852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:18.237086 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.237067 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:18.237244 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.237221 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:20.237196157 +0000 UTC m=+5.095913454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:18.338907 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:18.338248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:18.338907 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.338434 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:18.338907 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.338454 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:18.338907 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.338467 2572 projected.go:194] Error preparing data for projected volume kube-api-access-9cdps for pod openshift-network-diagnostics/network-check-target-jp7mf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:18.338907 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.338543 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps podName:164700ca-d6d4-4aee-86e6-4fca944bb4b5 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:20.338505767 +0000 UTC m=+5.197223066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9cdps" (UniqueName: "kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps") pod "network-check-target-jp7mf" (UID: "164700ca-d6d4-4aee-86e6-4fca944bb4b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:18.685853 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:18.685767 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:16 +0000 UTC" deadline="2027-12-22 18:28:11.343289186 +0000 UTC" Apr 16 18:02:18.685853 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:18.685805 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14760h25m52.657488585s" Apr 16 18:02:18.739425 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:18.739391 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:18.739616 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:18.739541 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:19.740281 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:19.740253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:19.740763 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:19.740374 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:20.255611 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:20.254977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:20.255611 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.255173 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:20.255611 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.255236 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:24.255218179 +0000 UTC m=+9.113935490 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:20.356633 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:20.356021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:20.356633 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.356203 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:20.356633 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.356221 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:20.356633 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.356233 2572 projected.go:194] Error preparing data for projected volume kube-api-access-9cdps for pod openshift-network-diagnostics/network-check-target-jp7mf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:20.356633 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.356287 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps podName:164700ca-d6d4-4aee-86e6-4fca944bb4b5 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:24.356269683 +0000 UTC m=+9.214986980 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9cdps" (UniqueName: "kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps") pod "network-check-target-jp7mf" (UID: "164700ca-d6d4-4aee-86e6-4fca944bb4b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:20.739992 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:20.739668 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:20.739992 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:20.739824 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:21.740415 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:21.740155 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:21.740415 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:21.740308 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:22.739843 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:22.739807 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:22.740029 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:22.739948 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:23.739969 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:23.739852 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:23.740404 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:23.739972 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:24.286488 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:24.286449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:24.286680 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.286635 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:24.286777 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.286743 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:32.286689462 +0000 UTC m=+17.145406779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:24.387707 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:24.387676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:24.387872 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.387833 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:24.387872 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.387850 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:24.387872 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.387862 2572 projected.go:194] Error preparing data for projected volume kube-api-access-9cdps for pod openshift-network-diagnostics/network-check-target-jp7mf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:24.388036 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.387921 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps podName:164700ca-d6d4-4aee-86e6-4fca944bb4b5 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:32.387902849 +0000 UTC m=+17.246620155 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9cdps" (UniqueName: "kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps") pod "network-check-target-jp7mf" (UID: "164700ca-d6d4-4aee-86e6-4fca944bb4b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:24.739732 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:24.739653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:24.739891 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:24.739803 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:25.740275 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:25.740239 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:25.740748 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:25.740357 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:26.739230 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:26.739144 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:26.739393 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:26.739284 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:27.739663 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:27.739628 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:27.740115 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:27.739760 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:28.739150 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:28.739117 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:28.739319 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:28.739236 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:29.739272 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:29.739246 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:29.739622 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:29.739337 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:30.739508 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:30.739476 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:30.739932 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:30.739617 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:31.739498 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:31.739468 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:31.739686 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:31.739598 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:32.349484 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:32.349446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:32.349704 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.349608 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:32.349704 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.349664 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.349647893 +0000 UTC m=+33.208365193 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:32.450228 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:32.450191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:32.450394 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.450356 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:32.450394 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.450373 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:32.450394 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.450383 2572 projected.go:194] Error preparing data for projected volume kube-api-access-9cdps for pod openshift-network-diagnostics/network-check-target-jp7mf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:32.450535 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.450450 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps podName:164700ca-d6d4-4aee-86e6-4fca944bb4b5 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.450431888 +0000 UTC m=+33.309149206 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9cdps" (UniqueName: "kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps") pod "network-check-target-jp7mf" (UID: "164700ca-d6d4-4aee-86e6-4fca944bb4b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:32.739387 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:32.739315 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:32.739550 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:32.739442 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:33.739604 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:33.739571 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:33.740018 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:33.739704 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:34.739534 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.739330 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:34.739653 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:34.739626 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:34.955551 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.955524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hksp" event={"ID":"d5943e99-4c81-4af3-a008-f184fe0a2d79","Type":"ContainerStarted","Data":"8a143250825eecf3116376075b96bca0fa540ef3522b4d1fcfbd9d26b54a31a5"} Apr 16 18:02:34.957123 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.957099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wdh27" event={"ID":"914e179f-5ddd-46e1-8fbc-fc98cc6389e5","Type":"ContainerStarted","Data":"ba2e1a62b11ce06fd77387f6ed48c57a001777311aefe332b1b3078f8f3cd181"} Apr 16 18:02:34.958540 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.958504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal" event={"ID":"a9348010412cd69bb166b4f63c170f91","Type":"ContainerStarted","Data":"8a808768fa9f0382c8d43f2584a84e7889900defabc5c4aa88e178c11550fffd"} Apr 16 18:02:34.960882 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.960867 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:02:34.961183 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.961163 2572 generic.go:358] "Generic (PLEG): container finished" podID="533bfb3b-fb81-47d8-a968-aa3baab674a7" containerID="4672b0d34f23b1c1a577469cf76d5ffa476541834a248b63c8cd90de3c2980f3" exitCode=1 Apr 16 18:02:34.961245 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.961187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"42ca86cab18667446f2ddd517a8bf3e67758acc3486adc39e737a25df966e75d"} Apr 16 18:02:34.961245 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.961210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"defc7c0cbda5ace7562a5cd96a246f7edcc12a5d5e064f28203034d70210a422"} Apr 16 18:02:34.961245 ip-10-0-128-209 kubenswrapper[2572]: 
I0416 18:02:34.961218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"47003bc6e9196a9d8145f75479ea288371d7588d62e3b42695f9ca653462b226"} Apr 16 18:02:34.961245 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.961226 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"d3408a3143fafba655c2f6ab84a60e8eed6b4d3504c84f589c5fab5490b2736f"} Apr 16 18:02:34.961245 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.961234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerDied","Data":"4672b0d34f23b1c1a577469cf76d5ffa476541834a248b63c8cd90de3c2980f3"} Apr 16 18:02:34.961427 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.961251 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"8fdaeb364484ca3e3d2d6141d140700de6179a46c10775ade6fc33648094bbd1"} Apr 16 18:02:34.982551 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:34.981995 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9hksp" podStartSLOduration=2.6548916 podStartE2EDuration="19.981979548s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:16.956778534 +0000 UTC m=+1.815495829" lastFinishedPulling="2026-04-16 18:02:34.283866482 +0000 UTC m=+19.142583777" observedRunningTime="2026-04-16 18:02:34.981942427 +0000 UTC m=+19.840659746" watchObservedRunningTime="2026-04-16 18:02:34.981979548 +0000 UTC m=+19.840696937" Apr 16 18:02:35.039248 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.039192 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-209.ec2.internal" podStartSLOduration=19.039170143 podStartE2EDuration="19.039170143s" podCreationTimestamp="2026-04-16 18:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:35.01050663 +0000 UTC m=+19.869223947" watchObservedRunningTime="2026-04-16 18:02:35.039170143 +0000 UTC m=+19.897887538" Apr 16 18:02:35.740968 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.740771 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:35.741500 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:35.741057 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
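The "Error syncing pod" entries above all reduce to one condition: the container runtime reports NetworkReady=false because no CNI network configuration exists yet under /etc/kubernetes/cni/net.d/, so the kubelet will not create sandboxes for pods that need pod networking (host-network pods keep starting, as the PLEG events above show). A minimal sketch of the check at issue, assuming only the directory named in the error message; the runtime's real config loading lives in CRI-O/ocicni, not in this snippet:

    // cnicheck reports whether the CNI conf directory named in the kubelet
    // errors above contains any network configuration yet. Illustrative only.
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	confDir := "/etc/kubernetes/cni/net.d" // directory from the log's error text
    	entries, err := os.ReadDir(confDir)
    	if err != nil {
    		fmt.Println("network not ready:", err)
    		return
    	}
    	for _, e := range entries {
    		switch filepath.Ext(e.Name()) {
    		case ".conf", ".conflist", ".json": // extensions ocicni-style loaders accept
    			fmt.Println("found CNI config:", e.Name())
    			return
    		}
    	}
    	fmt.Println("no CNI configuration file in", confDir)
    }

Once OVN-Kubernetes writes its config into that directory, the NetworkReady condition flips and these retries stop; that is exactly what precedes the NodeReady event later in this log.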
pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:35.793264 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.793243 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:02:35.965661 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.965550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zbk52" event={"ID":"f601d51e-6912-402c-abe7-76ac16678f2a","Type":"ContainerStarted","Data":"3fee0902c134ac2a2d29fa62c88c34dd317d39a122f4828418cde1d4e10a6d18"} Apr 16 18:02:35.966953 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.966926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zdnrz" event={"ID":"045160d0-0fd3-47d2-90ec-0bb2af115ef2","Type":"ContainerStarted","Data":"8d62a456327188d64c774a25a2f50b36f483ebdc182d8141cc7bbc6ce6f1fc23"} Apr 16 18:02:35.968086 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.968068 2572 generic.go:358] "Generic (PLEG): container finished" podID="f781e5cc-b111-4034-8a85-cae2e3e72a72" containerID="9a476e3b5ef597842890f0b2aba1f6b3b92d88d17f9d5452d2c0728006fd4877" exitCode=0 Apr 16 18:02:35.968170 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.968125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerDied","Data":"9a476e3b5ef597842890f0b2aba1f6b3b92d88d17f9d5452d2c0728006fd4877"} Apr 16 18:02:35.969408 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.969361 2572 generic.go:358] "Generic (PLEG): container finished" podID="34402c7e80e9e6cd1afa932f89941529" containerID="06d9cd1e477e959336760f4ab1296f27f832b1928ded5427d1eb8f137cd2b65e" exitCode=0 Apr 16 18:02:35.969461 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.969427 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal" event={"ID":"34402c7e80e9e6cd1afa932f89941529","Type":"ContainerDied","Data":"06d9cd1e477e959336760f4ab1296f27f832b1928ded5427d1eb8f137cd2b65e"} Apr 16 18:02:35.970665 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.970642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zkhsv" event={"ID":"b8b228d4-bea7-4887-8dd2-672c2f8c5e45","Type":"ContainerStarted","Data":"2f2ea6be5abcff97bd020fff35f2f4442d3d3d6a3c06df0a12950b2cfe914ef6"} Apr 16 18:02:35.972154 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.972132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" event={"ID":"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113","Type":"ContainerStarted","Data":"84bb7ac0bd7381b480b82d0326051eb57fef08d8d7d6f5e2e7b2f38d787a377d"} Apr 16 18:02:35.972154 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.972155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" event={"ID":"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113","Type":"ContainerStarted","Data":"a8e702363ef5ed35ff272ab70142d9982f4474d8178d27282e7604e061a4dd22"} Apr 16 18:02:35.973296 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.973272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f2fhk" 
event={"ID":"40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd","Type":"ContainerStarted","Data":"2038f4cb91076ed6a63faa3b248e7e7cfb6ac579f7452fe0b67d607c1aee0626"} Apr 16 18:02:35.986835 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:35.986794 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wdh27" podStartSLOduration=3.655717666 podStartE2EDuration="20.986781122s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:16.935279218 +0000 UTC m=+1.793996514" lastFinishedPulling="2026-04-16 18:02:34.266342667 +0000 UTC m=+19.125059970" observedRunningTime="2026-04-16 18:02:35.039361724 +0000 UTC m=+19.898079042" watchObservedRunningTime="2026-04-16 18:02:35.986781122 +0000 UTC m=+20.845498439" Apr 16 18:02:36.016113 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.016064 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zbk52" podStartSLOduration=3.772681199 podStartE2EDuration="21.016050336s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:17.021381564 +0000 UTC m=+1.880098860" lastFinishedPulling="2026-04-16 18:02:34.264750692 +0000 UTC m=+19.123467997" observedRunningTime="2026-04-16 18:02:35.989051643 +0000 UTC m=+20.847768959" watchObservedRunningTime="2026-04-16 18:02:36.016050336 +0000 UTC m=+20.874767656" Apr 16 18:02:36.044167 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.044129 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zdnrz" podStartSLOduration=3.788099856 podStartE2EDuration="21.044116832s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:16.951005638 +0000 UTC m=+1.809722933" lastFinishedPulling="2026-04-16 18:02:34.207022596 +0000 UTC m=+19.065739909" observedRunningTime="2026-04-16 18:02:36.016207184 +0000 UTC m=+20.874924501" watchObservedRunningTime="2026-04-16 18:02:36.044116832 +0000 UTC m=+20.902834161" Apr 16 18:02:36.044319 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.044300 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f2fhk" podStartSLOduration=3.8279854909999997 podStartE2EDuration="21.044296374s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:16.990724173 +0000 UTC m=+1.849441469" lastFinishedPulling="2026-04-16 18:02:34.207035046 +0000 UTC m=+19.065752352" observedRunningTime="2026-04-16 18:02:36.043894479 +0000 UTC m=+20.902611795" watchObservedRunningTime="2026-04-16 18:02:36.044296374 +0000 UTC m=+20.903013699" Apr 16 18:02:36.150147 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.150098 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zkhsv" podStartSLOduration=4.016037563 podStartE2EDuration="21.150085599s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:17.073011154 +0000 UTC m=+1.931728453" lastFinishedPulling="2026-04-16 18:02:34.207059179 +0000 UTC m=+19.065776489" observedRunningTime="2026-04-16 18:02:36.149314527 +0000 UTC m=+21.008031843" watchObservedRunningTime="2026-04-16 18:02:36.150085599 +0000 UTC m=+21.008802915" Apr 16 18:02:36.680667 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.680558 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:02:35.793257812Z","UUID":"d19a824c-0b7e-4c1f-b173-d04052aefd8a","Handler":null,"Name":"","Endpoint":""} Apr 16 18:02:36.682598 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.682569 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:02:36.682598 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.682601 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:02:36.740241 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.740218 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:36.740369 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:36.740343 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:36.978877 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.978802 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:02:36.979304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.979194 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"09b2f7e9d953f3a1dbeb9df1cf98428eb3aedd8d3e7700a441152999d51dfd98"} Apr 16 18:02:36.981088 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.981064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal" event={"ID":"34402c7e80e9e6cd1afa932f89941529","Type":"ContainerStarted","Data":"55dfeec0cdf03e2918371657ea3ad503afa2153e66f82f58df1840e858088e93"} Apr 16 18:02:36.983124 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.983052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" event={"ID":"1b73a7d2-ac33-42f6-91e5-c24cbb5b4113","Type":"ContainerStarted","Data":"f4d8d47660291971bc717f7458f17dd5d53b7aecdee2794c88ae7eb82482efd3"} Apr 16 18:02:36.998085 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:36.998051 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-209.ec2.internal" podStartSLOduration=20.99803915 podStartE2EDuration="20.99803915s" podCreationTimestamp="2026-04-16 18:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:36.997590398 +0000 UTC m=+21.856307725" watchObservedRunningTime="2026-04-16 18:02:36.99803915 +0000 UTC m=+21.856756467" Apr 16 18:02:37.018618 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:37.018573 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5ff9m" podStartSLOduration=2.384601995 podStartE2EDuration="22.018564678s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:17.005280298 +0000 UTC m=+1.863997593" lastFinishedPulling="2026-04-16 18:02:36.639242981 +0000 UTC m=+21.497960276" observedRunningTime="2026-04-16 18:02:37.018226453 +0000 UTC m=+21.876943770" watchObservedRunningTime="2026-04-16 18:02:37.018564678 +0000 UTC m=+21.877281993" Apr 16 18:02:37.740063 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:37.740036 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:37.740411 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:37.740140 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:38.739086 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:38.739053 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:38.739560 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:38.739163 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:39.739338 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:39.739314 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:39.739620 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:39.739424 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:39.991633 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:39.991475 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:02:39.991959 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:39.991934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"7427074ab557849393d3a4558994e591074fa1b88a7823c4688f990ccae53ca0"} Apr 16 18:02:39.992210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:39.992190 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:39.992428 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:39.992402 2572 scope.go:117] "RemoveContainer" containerID="4672b0d34f23b1c1a577469cf76d5ffa476541834a248b63c8cd90de3c2980f3" Apr 16 18:02:39.993572 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:39.993542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerStarted","Data":"0c0527980a184b5ab51f070b7214739c15807b4060ee111b2e08497718ec2fcc"} Apr 16 18:02:40.005635 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.005618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:40.225642 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.225574 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:40.226272 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.226256 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:40.739654 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.739631 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:40.740164 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:40.739733 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:40.998445 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.998365 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:02:40.998760 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.998734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" event={"ID":"533bfb3b-fb81-47d8-a968-aa3baab674a7","Type":"ContainerStarted","Data":"1f8d3b9d830af051eec9688fb1491f437b77693663564cf6c87141667dbf00ee"} Apr 16 18:02:40.999922 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.999895 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:41.000041 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:40.999936 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:41.004543 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.002612 2572 generic.go:358] "Generic (PLEG): container finished" podID="f781e5cc-b111-4034-8a85-cae2e3e72a72" containerID="0c0527980a184b5ab51f070b7214739c15807b4060ee111b2e08497718ec2fcc" exitCode=0 Apr 16 18:02:41.013746 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.013712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerDied","Data":"0c0527980a184b5ab51f070b7214739c15807b4060ee111b2e08497718ec2fcc"} Apr 16 18:02:41.016662 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.016639 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:41.020072 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.019111 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zkhsv" Apr 16 18:02:41.034780 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.034744 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:02:41.064432 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.064392 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" podStartSLOduration=8.788672381 podStartE2EDuration="26.064382354s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:17.068841205 +0000 UTC m=+1.927558500" lastFinishedPulling="2026-04-16 18:02:34.344551177 +0000 UTC m=+19.203268473" observedRunningTime="2026-04-16 18:02:41.062667585 +0000 UTC m=+25.921384901" watchObservedRunningTime="2026-04-16 18:02:41.064382354 +0000 UTC m=+25.923099670" Apr 16 18:02:41.670414 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.670239 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jp7mf"] Apr 16 18:02:41.670577 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.670530 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:41.670635 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:41.670614 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:41.678693 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.678668 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2k4qz"] Apr 16 18:02:41.678810 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:41.678769 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:41.678864 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:41.678845 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:42.006605 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:42.006508 2572 generic.go:358] "Generic (PLEG): container finished" podID="f781e5cc-b111-4034-8a85-cae2e3e72a72" containerID="b8c48bfb987944c36f39f1b6eda3de9b358f599a0f680d7e8a023396f2b26eb1" exitCode=0 Apr 16 18:02:42.007048 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:42.006610 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerDied","Data":"b8c48bfb987944c36f39f1b6eda3de9b358f599a0f680d7e8a023396f2b26eb1"} Apr 16 18:02:42.739361 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:42.739335 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:42.739479 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:42.739457 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:43.010296 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:43.010266 2572 generic.go:358] "Generic (PLEG): container finished" podID="f781e5cc-b111-4034-8a85-cae2e3e72a72" containerID="fbfbf4df5ae9b798a6bf40bc8598eb031107c51fcf1f69ddc9fb53d74927a9ca" exitCode=0 Apr 16 18:02:43.010709 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:43.010349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerDied","Data":"fbfbf4df5ae9b798a6bf40bc8598eb031107c51fcf1f69ddc9fb53d74927a9ca"} Apr 16 18:02:43.740231 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:43.740204 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:43.740359 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:43.740327 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:44.739321 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:44.739274 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:44.739781 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:44.739415 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:45.740779 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:45.740746 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:02:45.741166 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:45.740877 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jp7mf" podUID="164700ca-d6d4-4aee-86e6-4fca944bb4b5" Apr 16 18:02:46.739902 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:46.739869 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:02:46.740069 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:46.740011 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c" Apr 16 18:02:47.457556 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.457527 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-209.ec2.internal" event="NodeReady" Apr 16 18:02:47.457962 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.457653 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:02:47.508804 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.508779 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wmkzd"] Apr 16 18:02:47.541970 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.541941 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kqct2"] Apr 16 18:02:47.542103 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.541995 2572 util.go:30] "No sandbox for pod can be found. 
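The NodeReady pair at 18:02:47.457 is the pivot of this window: with OVN-Kubernetes up, the runtime network became ready, the kubelet recorded the NodeReady event and pushed a fast status update, and pods that had been waiting for a ready node start arriving; the SyncLoop ADDs for dns-default-wmkzd and ingress-canary-kqct2 follow within milliseconds. The flipped condition is the ordinary Ready condition on the Node object; a small illustrative helper over the public types (compiles given the k8s.io/api module):

    package main

    import (
    	"fmt"

    	v1 "k8s.io/api/core/v1"
    )

    // isNodeReady mirrors the usual condition scan: Ready must be True.
    func isNodeReady(node *v1.Node) bool {
    	for _, c := range node.Status.Conditions {
    		if c.Type == v1.NodeReady {
    			return c.Status == v1.ConditionTrue
    		}
    	}
    	return false
    }

    func main() {
    	n := &v1.Node{Status: v1.NodeStatus{Conditions: []v1.NodeCondition{
    		{Type: v1.NodeReady, Status: v1.ConditionTrue},
    	}}}
    	fmt.Println(isNodeReady(n)) // true
    }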
Apr 16 18:02:47.542103 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.541995 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.544914 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.544893 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:02:47.545024 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.544917 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:02:47.545024 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.544969 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lh6dc\""
Apr 16 18:02:47.560172 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.560151 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wmkzd"]
Apr 16 18:02:47.560268 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.560180 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kqct2"]
Apr 16 18:02:47.560304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.560276 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:47.563164 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.563141 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:02:47.563164 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.563162 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:02:47.563395 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.563381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:02:47.563459 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.563411 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-p7xs6\""
Apr 16 18:02:47.663244 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.663215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:47.663395 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.663254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.663395 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.663272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/237a647b-4edf-4b65-ad09-e3f76a13c168-config-volume\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.663395 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.663355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72nk\" (UniqueName: \"kubernetes.io/projected/f8b0f748-a8be-4032-a386-74c3dc7ad240-kube-api-access-g72nk\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:47.663395 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.663385 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/237a647b-4edf-4b65-ad09-e3f76a13c168-tmp-dir\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.663548 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.663414 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm4bp\" (UniqueName: \"kubernetes.io/projected/237a647b-4edf-4b65-ad09-e3f76a13c168-kube-api-access-tm4bp\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.739805 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.739728 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf"
Apr 16 18:02:47.743079 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.743057 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j756c\""
Apr 16 18:02:47.743188 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.743092 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:02:47.750430 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.750412 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:02:47.764586 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.764564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g72nk\" (UniqueName: \"kubernetes.io/projected/f8b0f748-a8be-4032-a386-74c3dc7ad240-kube-api-access-g72nk\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:47.764678 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.764598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/237a647b-4edf-4b65-ad09-e3f76a13c168-tmp-dir\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.764816 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.764798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm4bp\" (UniqueName: \"kubernetes.io/projected/237a647b-4edf-4b65-ad09-e3f76a13c168-kube-api-access-tm4bp\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.764880 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.764864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:47.764930 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.764910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.764990 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.764941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/237a647b-4edf-4b65-ad09-e3f76a13c168-tmp-dir\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.764990 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:47.764976 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:47.765081 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:47.764994 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:47.765081 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:47.765031 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.265012118 +0000 UTC m=+33.123729414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found
Apr 16 18:02:47.765081 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:47.765051 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.265040834 +0000 UTC m=+33.123758130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:47.765081 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.765071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/237a647b-4edf-4b65-ad09-e3f76a13c168-config-volume\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.765465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.765449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/237a647b-4edf-4b65-ad09-e3f76a13c168-config-volume\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.779023 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.778903 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm4bp\" (UniqueName: \"kubernetes.io/projected/237a647b-4edf-4b65-ad09-e3f76a13c168-kube-api-access-tm4bp\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:47.779113 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:47.778994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72nk\" (UniqueName: \"kubernetes.io/projected/f8b0f748-a8be-4032-a386-74c3dc7ad240-kube-api-access-g72nk\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:48.269444 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.269405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:48.269643 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.269463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:48.269643 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:48.269540 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:48.269643 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:48.269632 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:48.269788 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:48.269634 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:49.269609654 +0000 UTC m=+34.128326954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found
Apr 16 18:02:48.269788 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:48.269699 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:49.269682754 +0000 UTC m=+34.128400050 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:48.370062 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.370029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz"
Apr 16 18:02:48.370259 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:48.370181 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:48.370259 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:48.370245 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:20.370230917 +0000 UTC m=+65.228948212 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:48.470432 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.470401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf"
Apr 16 18:02:48.472912 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.472891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cdps\" (UniqueName: \"kubernetes.io/projected/164700ca-d6d4-4aee-86e6-4fca944bb4b5-kube-api-access-9cdps\") pod \"network-check-target-jp7mf\" (UID: \"164700ca-d6d4-4aee-86e6-4fca944bb4b5\") " pod="openshift-network-diagnostics/network-check-target-jp7mf"
Apr 16 18:02:48.649159 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.649128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jp7mf"
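The nestedpendingoperations entries encode the kubelet's per-volume retry backoff: durationBeforeRetry starts at 500ms and doubles on each failure (500ms and 1s are visible above for "cert" and "metrics-tls"; 2s and 4s follow below), and "metrics-certs", which has been failing since before this window, is already at 32s. A sketch of that schedule; the 2m2s ceiling is an assumption carried over from upstream kubelet defaults, not something this log shows:

    // Prints the doubling retry schedule implied by the durationBeforeRetry
    // values in the mount failures above.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const maxBackoff = 2*time.Minute + 2*time.Second // assumed cap
    	backoff := 500 * time.Millisecond
    	for retry := 1; retry <= 8; retry++ {
    		fmt.Printf("retry %d: durationBeforeRetry %v\n", retry, backoff)
    		backoff *= 2
    		if backoff > maxBackoff {
    			backoff = maxBackoff
    		}
    	}
    }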
Apr 16 18:02:48.739180 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.739156 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz"
Apr 16 18:02:48.742205 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.742186 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:02:48.742355 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.742336 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2dnp6\""
Apr 16 18:02:48.863642 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:48.863613 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jp7mf"]
Apr 16 18:02:48.867048 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:48.867024 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164700ca_d6d4_4aee_86e6_4fca944bb4b5.slice/crio-0d8d04beb04dc2de25e2ada1b4cd77ac603cbbf5cd0bc0c5893cba15473a0950 WatchSource:0}: Error finding container 0d8d04beb04dc2de25e2ada1b4cd77ac603cbbf5cd0bc0c5893cba15473a0950: Status 404 returned error can't find the container with id 0d8d04beb04dc2de25e2ada1b4cd77ac603cbbf5cd0bc0c5893cba15473a0950
Apr 16 18:02:49.023896 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:49.023868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerStarted","Data":"57e32f7e4f5b6907bc6eeaea1d918543ae7625b97ed826bbb166164cd9e2f334"}
Apr 16 18:02:49.024983 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:49.024958 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jp7mf" event={"ID":"164700ca-d6d4-4aee-86e6-4fca944bb4b5","Type":"ContainerStarted","Data":"0d8d04beb04dc2de25e2ada1b4cd77ac603cbbf5cd0bc0c5893cba15473a0950"}
Apr 16 18:02:49.275779 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:49.275706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:49.275779 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:49.275746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:49.275982 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:49.275844 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:49.275982 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:49.275917 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:51.275902506 +0000 UTC m=+36.134619802 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found
Apr 16 18:02:49.275982 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:49.275846 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:49.275982 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:49.275976 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:51.275964893 +0000 UTC m=+36.134682188 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:50.029922 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:50.029875 2572 generic.go:358] "Generic (PLEG): container finished" podID="f781e5cc-b111-4034-8a85-cae2e3e72a72" containerID="57e32f7e4f5b6907bc6eeaea1d918543ae7625b97ed826bbb166164cd9e2f334" exitCode=0
Apr 16 18:02:50.030829 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:50.029937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerDied","Data":"57e32f7e4f5b6907bc6eeaea1d918543ae7625b97ed826bbb166164cd9e2f334"}
Apr 16 18:02:51.034754 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:51.034572 2572 generic.go:358] "Generic (PLEG): container finished" podID="f781e5cc-b111-4034-8a85-cae2e3e72a72" containerID="0c9ef94ad2d0ac075437b859e868168a64ab133af80f835b6fabe5f0b492c668" exitCode=0
Apr 16 18:02:51.034754 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:51.034653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerDied","Data":"0c9ef94ad2d0ac075437b859e868168a64ab133af80f835b6fabe5f0b492c668"}
Apr 16 18:02:51.291723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:51.291648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:02:51.291723 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:51.291690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:02:51.291917 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:51.291803 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:51.291917 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:51.291811 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:51.291917 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:51.291863 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:55.291847052 +0000 UTC m=+40.150564347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:51.291917 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:51.291877 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:55.291870975 +0000 UTC m=+40.150588271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found
Apr 16 18:02:52.039747 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:52.039717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vtf28" event={"ID":"f781e5cc-b111-4034-8a85-cae2e3e72a72","Type":"ContainerStarted","Data":"5664e392b6aeb25ca00e5af9fc771fe4aa0770e768cf6ab4f38bd1c13cca6adc"}
Apr 16 18:02:52.064653 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:52.064609 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vtf28" podStartSLOduration=5.106230517 podStartE2EDuration="37.064594449s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:16.773844264 +0000 UTC m=+1.632561560" lastFinishedPulling="2026-04-16 18:02:48.732208191 +0000 UTC m=+33.590925492" observedRunningTime="2026-04-16 18:02:52.062726868 +0000 UTC m=+36.921444187" watchObservedRunningTime="2026-04-16 18:02:52.064594449 +0000 UTC m=+36.923311765"
Apr 16 18:02:53.042695 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.042659 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jp7mf" event={"ID":"164700ca-d6d4-4aee-86e6-4fca944bb4b5","Type":"ContainerStarted","Data":"2ecb2ffa8976f51953329a894c99cbc2d2c009732b2ec087ddd2a661d72b9056"}
Apr 16 18:02:53.043142 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.042822 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jp7mf"
Apr 16 18:02:53.064065 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.064024 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jp7mf" podStartSLOduration=35.000026091 podStartE2EDuration="38.064012181s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:02:48.869002861 +0000 UTC m=+33.727720156" lastFinishedPulling="2026-04-16 18:02:51.932988951 +0000 UTC m=+36.791706246" observedRunningTime="2026-04-16 18:02:53.062598499 +0000 UTC m=+37.921315815" watchObservedRunningTime="2026-04-16 18:02:53.064012181 +0000 UTC m=+37.922729497"
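Two distinct secret failures appear in this window. `secret "canary-serving-cert" not found` means the kubelet's watch for the object is established but the secret does not exist yet on the API server; `object "openshift-multus"/"metrics-daemon-secret" not registered` means the kubelet had not yet set up a reflector for that secret at all, which is why the "Caches populated ... metrics-daemon-secret" line at 18:02:48.742 has to appear before that mount can ever succeed. A toy model of the two-stage lookup, with hypothetical names; the real cache is the kubelet's watch-based secret manager:

    package main

    import "fmt"

    // secretCache models a watch-based cache: a key is "registered" once a
    // reflector watches it; its data stays nil until the object exists upstream.
    type secretCache struct {
    	registered map[string]map[string]string
    }

    func (c *secretCache) get(ns, name string) (map[string]string, error) {
    	data, ok := c.registered[ns+"/"+name]
    	if !ok {
    		return nil, fmt.Errorf("object %q/%q not registered", ns, name)
    	}
    	if data == nil {
    		return nil, fmt.Errorf("secret %q not found", name)
    	}
    	return data, nil
    }

    func main() {
    	c := &secretCache{registered: map[string]map[string]string{
    		"openshift-ingress-canary/canary-serving-cert": nil, // watched, absent upstream
    	}}
    	_, err := c.get("openshift-multus", "metrics-daemon-secret")
    	fmt.Println(err) // not registered: no watch yet
    	_, err = c.get("openshift-ingress-canary", "canary-serving-cert")
    	fmt.Println(err) // not found: watched, but object missing
    }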
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd"] Apr 16 18:02:53.675258 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.675228 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd"] Apr 16 18:02:53.675369 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.675292 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.679563 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.679539 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:02:53.679563 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.679559 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:02:53.679734 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.679564 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:02:53.679836 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.679822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:02:53.707097 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.707070 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q"] Apr 16 18:02:53.728497 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.728481 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q"] Apr 16 18:02:53.728619 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.728603 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.731616 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.731593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 18:02:53.731715 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.731631 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:02:53.731715 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.731662 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:02:53.731822 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.731721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:02:53.810379 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.810357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmgvh\" (UniqueName: \"kubernetes.io/projected/e58d52da-9882-4f26-9068-6bd896f8e549-kube-api-access-bmgvh\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.810471 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.810399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e58d52da-9882-4f26-9068-6bd896f8e549-tmp\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.810471 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.810435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e58d52da-9882-4f26-9068-6bd896f8e549-klusterlet-config\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.910974 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.910916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e58d52da-9882-4f26-9068-6bd896f8e549-klusterlet-config\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.910974 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.910960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.911110 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.910980 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgvh\" (UniqueName: \"kubernetes.io/projected/e58d52da-9882-4f26-9068-6bd896f8e549-kube-api-access-bmgvh\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.911110 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e58d52da-9882-4f26-9068-6bd896f8e549-tmp\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.911110 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-ca\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.911110 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911075 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/14ba27c3-91a6-4157-8692-9b6a6e505b65-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.911110 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nrpj\" (UniqueName: \"kubernetes.io/projected/14ba27c3-91a6-4157-8692-9b6a6e505b65-kube-api-access-7nrpj\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.911307 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.911307 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911175 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-hub\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:53.911436 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.911420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e58d52da-9882-4f26-9068-6bd896f8e549-tmp\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: 
\"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.914602 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.914586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e58d52da-9882-4f26-9068-6bd896f8e549-klusterlet-config\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.926523 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.926484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmgvh\" (UniqueName: \"kubernetes.io/projected/e58d52da-9882-4f26-9068-6bd896f8e549-kube-api-access-bmgvh\") pod \"klusterlet-addon-workmgr-59f9dc8ccd-4m9fd\" (UID: \"e58d52da-9882-4f26-9068-6bd896f8e549\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:53.984375 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:53.984353 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:02:54.012206 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-ca\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.012316 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/14ba27c3-91a6-4157-8692-9b6a6e505b65-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.012316 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nrpj\" (UniqueName: \"kubernetes.io/projected/14ba27c3-91a6-4157-8692-9b6a6e505b65-kube-api-access-7nrpj\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.012316 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.012316 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-hub\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.012529 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.012962 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.012940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/14ba27c3-91a6-4157-8692-9b6a6e505b65-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.014553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.014505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.014932 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.014916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-ca\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.015028 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.015012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-hub\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.015341 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.015319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/14ba27c3-91a6-4157-8692-9b6a6e505b65-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.023049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.023028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nrpj\" (UniqueName: \"kubernetes.io/projected/14ba27c3-91a6-4157-8692-9b6a6e505b65-kube-api-access-7nrpj\") pod \"cluster-proxy-proxy-agent-7c46c4c674-99b4q\" (UID: \"14ba27c3-91a6-4157-8692-9b6a6e505b65\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.043323 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.043278 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:02:54.101990 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.101919 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd"] Apr 16 18:02:54.158204 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:54.158181 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q"] Apr 16 18:02:54.160526 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:02:54.160492 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ba27c3_91a6_4157_8692_9b6a6e505b65.slice/crio-ffa96f005ccc985d009e379563ffaa90960da237b4fd03b9ee771d36db311c19 WatchSource:0}: Error finding container ffa96f005ccc985d009e379563ffaa90960da237b4fd03b9ee771d36db311c19: Status 404 returned error can't find the container with id ffa96f005ccc985d009e379563ffaa90960da237b4fd03b9ee771d36db311c19 Apr 16 18:02:55.048282 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:55.048238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" event={"ID":"e58d52da-9882-4f26-9068-6bd896f8e549","Type":"ContainerStarted","Data":"1e825251d90c50689952b5580b408cda57ad91c5ddd6bd6b1c40e5676edc0756"} Apr 16 18:02:55.049495 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:55.049462 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" event={"ID":"14ba27c3-91a6-4157-8692-9b6a6e505b65","Type":"ContainerStarted","Data":"ffa96f005ccc985d009e379563ffaa90960da237b4fd03b9ee771d36db311c19"} Apr 16 18:02:55.322311 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:55.322234 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2" Apr 16 18:02:55.322311 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:02:55.322295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd" Apr 16 18:02:55.322531 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:55.322440 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:55.322531 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:55.322498 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:03.322479706 +0000 UTC m=+48.181197008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found Apr 16 18:02:55.323008 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:55.322904 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:55.323008 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:02:55.322960 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:03.322944535 +0000 UTC m=+48.181661834 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found Apr 16 18:03:00.062094 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:00.062038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" event={"ID":"14ba27c3-91a6-4157-8692-9b6a6e505b65","Type":"ContainerStarted","Data":"19bbf5a66d402dde2ddd33146e5c6bb493c5627e916d6ad0269c698b163d17bf"} Apr 16 18:03:00.063561 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:00.063536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" event={"ID":"e58d52da-9882-4f26-9068-6bd896f8e549","Type":"ContainerStarted","Data":"01057c30108e77f0a125bf1da85679d9adc2f027ea0e2a303c6b0606da2a6e6b"} Apr 16 18:03:00.063791 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:00.063772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:03:00.065653 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:00.065634 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:03:00.085482 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:00.085440 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" podStartSLOduration=1.725532896 podStartE2EDuration="7.085427644s" podCreationTimestamp="2026-04-16 18:02:53 +0000 UTC" firstStartedPulling="2026-04-16 18:02:54.107550106 +0000 UTC m=+38.966267401" lastFinishedPulling="2026-04-16 18:02:59.467444851 +0000 UTC m=+44.326162149" observedRunningTime="2026-04-16 18:03:00.084332785 +0000 UTC m=+44.943050103" watchObservedRunningTime="2026-04-16 18:03:00.085427644 +0000 UTC m=+44.944144959" Apr 16 18:03:02.069242 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:02.069206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" event={"ID":"14ba27c3-91a6-4157-8692-9b6a6e505b65","Type":"ContainerStarted","Data":"c491dec4247a9d68b95b560e36fbc6a0b7c1ade4e8d83ed4ef352c14c77da937"} Apr 16 18:03:02.069242 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:02.069244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" event={"ID":"14ba27c3-91a6-4157-8692-9b6a6e505b65","Type":"ContainerStarted","Data":"a553b25aaac80fe89479a7d574d227dcb49bad82a67f0c8a852c1a7a336de65d"} Apr 16 18:03:03.381370 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:03.381330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd" Apr 16 18:03:03.381781 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:03.381414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2" Apr 16 18:03:03.381781 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:03.381457 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:03.381781 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:03.381507 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:03.381781 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:03.381560 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:19.38153992 +0000 UTC m=+64.240257231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found Apr 16 18:03:03.381781 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:03.381587 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:19.381570025 +0000 UTC m=+64.240287337 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found Apr 16 18:03:13.028586 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:13.028556 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x27gf" Apr 16 18:03:13.063419 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:13.063372 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" podStartSLOduration=12.691033713 podStartE2EDuration="20.063360841s" podCreationTimestamp="2026-04-16 18:02:53 +0000 UTC" firstStartedPulling="2026-04-16 18:02:54.162211849 +0000 UTC m=+39.020929144" lastFinishedPulling="2026-04-16 18:03:01.534538967 +0000 UTC m=+46.393256272" observedRunningTime="2026-04-16 18:03:02.095637126 +0000 UTC m=+46.954354443" watchObservedRunningTime="2026-04-16 18:03:13.063360841 +0000 UTC m=+57.922078161" Apr 16 18:03:19.388860 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:19.388820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd" Apr 16 18:03:19.389282 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:19.388888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2" Apr 16 18:03:19.389282 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:19.388983 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:19.389282 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:19.388990 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:19.389282 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:19.389041 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:51.389024513 +0000 UTC m=+96.247741808 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found Apr 16 18:03:19.389282 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:19.389068 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:51.389049189 +0000 UTC m=+96.247766485 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found Apr 16 18:03:20.396990 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:20.396953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:03:20.399955 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:20.399933 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:03:20.407591 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:20.407568 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:03:20.407677 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:20.407636 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.407615536 +0000 UTC m=+129.266332837 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : secret "metrics-daemon-secret" not found Apr 16 18:03:24.047498 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:24.047467 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jp7mf" Apr 16 18:03:51.402697 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:51.402655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2" Apr 16 18:03:51.403190 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:03:51.402709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd" Apr 16 18:03:51.403190 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:51.402810 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:51.403190 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:51.402888 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls podName:237a647b-4edf-4b65-ad09-e3f76a13c168 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:55.402867269 +0000 UTC m=+160.261584564 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls") pod "dns-default-wmkzd" (UID: "237a647b-4edf-4b65-ad09-e3f76a13c168") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:51.403190 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:51.402808 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:51.403190 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:03:51.402952 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert podName:f8b0f748-a8be-4032-a386-74c3dc7ad240 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:55.40293404 +0000 UTC m=+160.261651344 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert") pod "ingress-canary-kqct2" (UID: "f8b0f748-a8be-4032-a386-74c3dc7ad240") : secret "canary-serving-cert" not found
Apr 16 18:04:24.414089 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:24.414052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz"
Apr 16 18:04:24.414582 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:04:24.414188 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:04:24.414582 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:04:24.414260 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs podName:5c4e7715-635e-4cb8-b891-8d2f74e1ef9c nodeName:}" failed. No retries permitted until 2026-04-16 18:06:26.414244404 +0000 UTC m=+251.272961700 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs") pod "network-metrics-daemon-2k4qz" (UID: "5c4e7715-635e-4cb8-b891-8d2f74e1ef9c") : secret "metrics-daemon-secret" not found
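The retry spacing in these entries climbs 4s, 8s, 16s, 32s, 1m4s and finally 2m2s: kubelet backs off exponentially per volume operation, doubling the delay after each failure and capping it, and the metrics-certs operation above has just hit the cap. A minimal model of that doubling-with-cap policy, a sketch of the behavior visible in the log rather than the actual kubelet source; the 500ms starting point is an assumption about the first, unlogged retries:

    // backoff.go: reproduce the durationBeforeRetry progression seen above.
    package main

    import (
    	"fmt"
    	"time"
    )

    const (
    	initialDelay = 500 * time.Millisecond        // assumed first delay
    	maxDelay     = 2*time.Minute + 2*time.Second // the 2m2s cap in the log
    )

    func main() {
    	d := initialDelay
    	for i := 0; i < 10; i++ {
    		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
    		d *= 2
    		if d > maxDelay {
    			d = maxDelay
    		}
    	}
    }

Note also the "context deadline exceeded" pod_workers errors just below: independent of this backoff, a single pod sync attempt only waits a bounded time (on the order of two minutes) for its volumes before giving up and re-queuing the pod.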
Apr 16 18:04:29.282999 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:29.282974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zdnrz_045160d0-0fd3-47d2-90ec-0bb2af115ef2/dns-node-resolver/0.log"
Apr 16 18:04:30.280883 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:30.280858 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zbk52_f601d51e-6912-402c-abe7-76ac16678f2a/node-ca/0.log"
Apr 16 18:04:50.552188 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:04:50.552139 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wmkzd" podUID="237a647b-4edf-4b65-ad09-e3f76a13c168"
Apr 16 18:04:50.570262 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:04:50.570244 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kqct2" podUID="f8b0f748-a8be-4032-a386-74c3dc7ad240"
Apr 16 18:04:51.305244 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:51.305215 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:04:51.748132 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:04:51.748100 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2k4qz" podUID="5c4e7715-635e-4cb8-b891-8d2f74e1ef9c"
Apr 16 18:04:52.283595 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.283569 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qwzcn"]
Apr 16 18:04:52.285491 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.285473 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.288351 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.288331 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:04:52.290024 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.290003 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vblvg\"" Apr 16 18:04:52.290024 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.290016 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:04:52.290154 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.290004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:04:52.290154 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.290018 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:04:52.306295 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.306275 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qwzcn"] Apr 16 18:04:52.404134 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.404114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dee71a84-286e-4bb9-bf9b-746f703763c2-data-volume\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.404235 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.404153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dee71a84-286e-4bb9-bf9b-746f703763c2-crio-socket\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.404235 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.404182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dee71a84-286e-4bb9-bf9b-746f703763c2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.404309 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.404243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dee71a84-286e-4bb9-bf9b-746f703763c2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.404309 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.404293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hcz8\" (UniqueName: \"kubernetes.io/projected/dee71a84-286e-4bb9-bf9b-746f703763c2-kube-api-access-5hcz8\") pod \"insights-runtime-extractor-qwzcn\" (UID: 
\"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505191 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dee71a84-286e-4bb9-bf9b-746f703763c2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505265 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dee71a84-286e-4bb9-bf9b-746f703763c2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505265 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hcz8\" (UniqueName: \"kubernetes.io/projected/dee71a84-286e-4bb9-bf9b-746f703763c2-kube-api-access-5hcz8\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505265 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dee71a84-286e-4bb9-bf9b-746f703763c2-data-volume\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505388 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dee71a84-286e-4bb9-bf9b-746f703763c2-crio-socket\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505388 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505338 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dee71a84-286e-4bb9-bf9b-746f703763c2-crio-socket\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505650 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505632 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dee71a84-286e-4bb9-bf9b-746f703763c2-data-volume\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.505758 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.505743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dee71a84-286e-4bb9-bf9b-746f703763c2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn" Apr 16 18:04:52.507355 ip-10-0-128-209 
kubenswrapper[2572]: I0416 18:04:52.507340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dee71a84-286e-4bb9-bf9b-746f703763c2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn"
Apr 16 18:04:52.521334 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.521308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hcz8\" (UniqueName: \"kubernetes.io/projected/dee71a84-286e-4bb9-bf9b-746f703763c2-kube-api-access-5hcz8\") pod \"insights-runtime-extractor-qwzcn\" (UID: \"dee71a84-286e-4bb9-bf9b-746f703763c2\") " pod="openshift-insights/insights-runtime-extractor-qwzcn"
Apr 16 18:04:52.593780 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.593712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qwzcn"
Apr 16 18:04:52.705124 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:52.705085 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qwzcn"]
Apr 16 18:04:52.708558 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:04:52.708531 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee71a84_286e_4bb9_bf9b_746f703763c2.slice/crio-0aa249b4088f3232b237c96efe7208863f0d3cc57738177fa77141fde542a880 WatchSource:0}: Error finding container 0aa249b4088f3232b237c96efe7208863f0d3cc57738177fa77141fde542a880: Status 404 returned error can't find the container with id 0aa249b4088f3232b237c96efe7208863f0d3cc57738177fa77141fde542a880
Apr 16 18:04:53.311091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:53.311067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qwzcn" event={"ID":"dee71a84-286e-4bb9-bf9b-746f703763c2","Type":"ContainerStarted","Data":"ac5115992633d80bd7ce6c1a75900689f4743824ad80516077ce18c531f3ff31"}
Apr 16 18:04:53.311344 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:53.311098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qwzcn" event={"ID":"dee71a84-286e-4bb9-bf9b-746f703763c2","Type":"ContainerStarted","Data":"da365eac7779ccc4a1d19868e3d77ba2f341114c6d285aff7ef968381ea44b00"}
Apr 16 18:04:53.311344 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:53.311107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qwzcn" event={"ID":"dee71a84-286e-4bb9-bf9b-746f703763c2","Type":"ContainerStarted","Data":"0aa249b4088f3232b237c96efe7208863f0d3cc57738177fa77141fde542a880"}
Apr 16 18:04:55.318000 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.317962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qwzcn" event={"ID":"dee71a84-286e-4bb9-bf9b-746f703763c2","Type":"ContainerStarted","Data":"9f0ce50685a5a39d860444de3d3b081bb71d67c4b7fec9872a69f4e035137487"}
Apr 16 18:04:55.338084 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.338042 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qwzcn" podStartSLOduration=1.513125756 podStartE2EDuration="3.338029001s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="2026-04-16 18:04:52.756963327 +0000 UTC m=+157.615680622" lastFinishedPulling="2026-04-16 18:04:54.581866563 +0000 UTC m=+159.440583867" observedRunningTime="2026-04-16 18:04:55.337590644 +0000 UTC m=+160.196307960" watchObservedRunningTime="2026-04-16 18:04:55.338029001 +0000 UTC m=+160.196746317"
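This startup-latency entry carries enough data to recheck the tracker's arithmetic: podStartSLOduration is podStartE2EDuration minus the image-pull window, taken on the monotonic m= offsets rather than the wall-clock timestamps. A quick verification against the numbers logged above:

    // slomath.go: recompute podStartSLOduration for insights-runtime-extractor-qwzcn
    // from the monotonic (m=) offsets in the entry above.
    package main

    import "fmt"

    func main() {
    	const (
    		e2eSeconds  = 3.338029001   // podStartE2EDuration
    		firstPullM  = 157.615680622 // firstStartedPulling, m= offset
    		lastPulledM = 159.440583867 // lastFinishedPulling, m= offset
    	)
    	// SLO duration excludes time spent pulling images.
    	slo := e2eSeconds - (lastPulledM - firstPullM)
    	fmt.Printf("podStartSLOduration=%.9f\n", slo) // 1.513125756, matching the log
    }

The multus entry at 18:02:52 obeys the same identity: 37.064594449 - (33.590925492 - 1.632561560) = 5.106230517.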
Apr 16 18:04:55.425989 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.425958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:04:55.426107 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.426000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:04:55.428310 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.428286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b0f748-a8be-4032-a386-74c3dc7ad240-cert\") pod \"ingress-canary-kqct2\" (UID: \"f8b0f748-a8be-4032-a386-74c3dc7ad240\") " pod="openshift-ingress-canary/ingress-canary-kqct2"
Apr 16 18:04:55.428408 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.428315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/237a647b-4edf-4b65-ad09-e3f76a13c168-metrics-tls\") pod \"dns-default-wmkzd\" (UID: \"237a647b-4edf-4b65-ad09-e3f76a13c168\") " pod="openshift-dns/dns-default-wmkzd"
Apr 16 18:04:55.509172 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.509147 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lh6dc\""
Apr 16 18:04:55.516731 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.516716 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-wmkzd" Apr 16 18:04:55.637730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:55.637698 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wmkzd"] Apr 16 18:04:55.641644 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:04:55.641618 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod237a647b_4edf_4b65_ad09_e3f76a13c168.slice/crio-eb5fc4a2789d771cffeb6e18e5b714ce51fb6b40685d4618421b24d5039ead9c WatchSource:0}: Error finding container eb5fc4a2789d771cffeb6e18e5b714ce51fb6b40685d4618421b24d5039ead9c: Status 404 returned error can't find the container with id eb5fc4a2789d771cffeb6e18e5b714ce51fb6b40685d4618421b24d5039ead9c Apr 16 18:04:56.321352 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:56.321317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wmkzd" event={"ID":"237a647b-4edf-4b65-ad09-e3f76a13c168","Type":"ContainerStarted","Data":"eb5fc4a2789d771cffeb6e18e5b714ce51fb6b40685d4618421b24d5039ead9c"} Apr 16 18:04:57.325439 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:57.325405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wmkzd" event={"ID":"237a647b-4edf-4b65-ad09-e3f76a13c168","Type":"ContainerStarted","Data":"d7042c61f7867833b6a5841a5c357c6ffedfc484b6a500bb7129ac8bfb197509"} Apr 16 18:04:57.325439 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:57.325444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wmkzd" event={"ID":"237a647b-4edf-4b65-ad09-e3f76a13c168","Type":"ContainerStarted","Data":"39d3fe717d17bdaefed05dd11d2875bf71dfe994b0f9011616c54d501e79814d"} Apr 16 18:04:57.325871 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:57.325552 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wmkzd" Apr 16 18:04:57.346013 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:57.345964 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wmkzd" podStartSLOduration=129.266617294 podStartE2EDuration="2m10.345951462s" podCreationTimestamp="2026-04-16 18:02:47 +0000 UTC" firstStartedPulling="2026-04-16 18:04:55.643429283 +0000 UTC m=+160.502146578" lastFinishedPulling="2026-04-16 18:04:56.722763448 +0000 UTC m=+161.581480746" observedRunningTime="2026-04-16 18:04:57.34474811 +0000 UTC m=+162.203465427" watchObservedRunningTime="2026-04-16 18:04:57.345951462 +0000 UTC m=+162.204668780" Apr 16 18:04:59.188301 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.188267 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c7c5c44ff-k652s"] Apr 16 18:04:59.190219 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.190203 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.193159 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.193137 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:04:59.194438 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194421 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:04:59.194535 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194440 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:04:59.194535 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194490 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:04:59.194634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194546 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-87wwb\"" Apr 16 18:04:59.194929 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194913 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:04:59.194980 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194929 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:04:59.194980 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.194936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:04:59.199114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.199099 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:04:59.201257 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.201240 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c7c5c44ff-k652s"] Apr 16 18:04:59.351881 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.351857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-trusted-ca-bundle\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.352012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.351885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-oauth-serving-cert\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.352012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.351922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-serving-cert\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.352012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.351945 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-console-config\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.352012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.351969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-oauth-config\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.352012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.351990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-service-ca\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.352156 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.352014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9skdj\" (UniqueName: \"kubernetes.io/projected/f1a845d1-0931-4603-956f-f21f93dce1a3-kube-api-access-9skdj\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.452771 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.452708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-oauth-config\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.452771 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.452735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-service-ca\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.452771 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.452761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9skdj\" (UniqueName: \"kubernetes.io/projected/f1a845d1-0931-4603-956f-f21f93dce1a3-kube-api-access-9skdj\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.452964 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.452781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-trusted-ca-bundle\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.452964 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.452911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-oauth-serving-cert\") pod 
\"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.453116 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.452991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-serving-cert\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.453116 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.453034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-console-config\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.453484 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.453459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-service-ca\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.453625 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.453602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-oauth-serving-cert\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.453722 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.453703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-console-config\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.453781 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.453727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-trusted-ca-bundle\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.455031 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.455008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-oauth-config\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.455218 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.455199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-serving-cert\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.461174 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.461155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9skdj\" (UniqueName: \"kubernetes.io/projected/f1a845d1-0931-4603-956f-f21f93dce1a3-kube-api-access-9skdj\") pod \"console-7c7c5c44ff-k652s\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.499527 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.499499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:04:59.610692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:04:59.610665 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c7c5c44ff-k652s"] Apr 16 18:04:59.652247 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:04:59.652217 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a845d1_0931_4603_956f_f21f93dce1a3.slice/crio-1517514b30a149d79179db6463cf675a3f3547228155fa4e6d6a94d1710f5561 WatchSource:0}: Error finding container 1517514b30a149d79179db6463cf675a3f3547228155fa4e6d6a94d1710f5561: Status 404 returned error can't find the container with id 1517514b30a149d79179db6463cf675a3f3547228155fa4e6d6a94d1710f5561 Apr 16 18:05:00.064820 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.064765 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" podUID="e58d52da-9882-4f26-9068-6bd896f8e549" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused" Apr 16 18:05:00.244554 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.244524 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-hsnh6"] Apr 16 18:05:00.246630 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.246608 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4mzpt"] Apr 16 18:05:00.246768 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.246752 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.248687 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.248669 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.249589 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.249571 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:05:00.251628 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.251607 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:05:00.251728 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.251638 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:05:00.251728 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.251709 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:05:00.251842 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.251755 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:05:00.251842 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.251780 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s82h7\"" Apr 16 18:05:00.252264 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.252134 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:05:00.252264 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.252169 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2kksm\"" Apr 16 18:05:00.252399 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.252375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:05:00.252594 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.252578 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:05:00.252673 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.252593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:05:00.259861 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.259837 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-hsnh6"] Apr 16 18:05:00.333384 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.333299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7c5c44ff-k652s" event={"ID":"f1a845d1-0931-4603-956f-f21f93dce1a3","Type":"ContainerStarted","Data":"1517514b30a149d79179db6463cf675a3f3547228155fa4e6d6a94d1710f5561"} Apr 16 18:05:00.334647 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.334618 2572 generic.go:358] "Generic (PLEG): container finished" podID="e58d52da-9882-4f26-9068-6bd896f8e549" containerID="01057c30108e77f0a125bf1da85679d9adc2f027ea0e2a303c6b0606da2a6e6b" exitCode=1 Apr 16 18:05:00.334743 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.334669 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" 
event={"ID":"e58d52da-9882-4f26-9068-6bd896f8e549","Type":"ContainerDied","Data":"01057c30108e77f0a125bf1da85679d9adc2f027ea0e2a303c6b0606da2a6e6b"} Apr 16 18:05:00.335020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.335005 2572 scope.go:117] "RemoveContainer" containerID="01057c30108e77f0a125bf1da85679d9adc2f027ea0e2a303c6b0606da2a6e6b" Apr 16 18:05:00.359284 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359257 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7365984a-9f2d-436f-bea8-7faf76f34ed0-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.359384 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359295 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/800daf97-3eb8-47d8-abed-a15df1b37ef8-metrics-client-ca\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359384 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7365984a-9f2d-436f-bea8-7faf76f34ed0-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.359486 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.359486 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-textfile\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359609 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-sys\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359609 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359609 ip-10-0-128-209 
kubenswrapper[2572]: I0416 18:05:00.359601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.359755 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-wtmp\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359755 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.359755 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5wgv\" (UniqueName: \"kubernetes.io/projected/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-api-access-v5wgv\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.359755 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359755 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-root\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359959 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359772 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-tls\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.359959 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.359796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s78f9\" (UniqueName: \"kubernetes.io/projected/800daf97-3eb8-47d8-abed-a15df1b37ef8-kube-api-access-s78f9\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " 
pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.460959 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.460934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.461071 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.460965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-textfile\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461071 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.460993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-sys\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461071 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461010 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461214 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-sys\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461214 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.461214 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-wtmp\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461214 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:05:00.461177 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:05:00.461214 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:05:00.461226 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-tls podName:7365984a-9f2d-436f-bea8-7faf76f34ed0 nodeName:}" failed. No retries permitted until 2026-04-16 18:05:00.961207511 +0000 UTC m=+165.819924806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-hsnh6" (UID: "7365984a-9f2d-436f-bea8-7faf76f34ed0") : secret "kube-state-metrics-tls" not found Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5wgv\" (UniqueName: \"kubernetes.io/projected/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-api-access-v5wgv\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-textfile\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-root\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-tls\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-wtmp\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461381 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/800daf97-3eb8-47d8-abed-a15df1b37ef8-root\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461442 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s78f9\" (UniqueName: \"kubernetes.io/projected/800daf97-3eb8-47d8-abed-a15df1b37ef8-kube-api-access-s78f9\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.461804 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.461453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7365984a-9f2d-436f-bea8-7faf76f34ed0-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.462469 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.462314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.462469 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.462394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/800daf97-3eb8-47d8-abed-a15df1b37ef8-metrics-client-ca\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.462469 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.462399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7365984a-9f2d-436f-bea8-7faf76f34ed0-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.462469 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.462406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.463279 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.462774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7365984a-9f2d-436f-bea8-7faf76f34ed0-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.463279 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.463139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/800daf97-3eb8-47d8-abed-a15df1b37ef8-metrics-client-ca\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.466530 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.464059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.466530 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.464187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7365984a-9f2d-436f-bea8-7faf76f34ed0-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.466530 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.464467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-tls\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.467108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.467089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/800daf97-3eb8-47d8-abed-a15df1b37ef8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.470099 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.470079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5wgv\" (UniqueName: \"kubernetes.io/projected/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-api-access-v5wgv\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.470690 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.470674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s78f9\" (UniqueName: \"kubernetes.io/projected/800daf97-3eb8-47d8-abed-a15df1b37ef8-kube-api-access-s78f9\") pod \"node-exporter-4mzpt\" (UID: \"800daf97-3eb8-47d8-abed-a15df1b37ef8\") " pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.562533 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.562493 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4mzpt" Apr 16 18:05:00.575775 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:05:00.575741 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800daf97_3eb8_47d8_abed_a15df1b37ef8.slice/crio-cb452367ec3981f84ad475759d5677a6c3c1bff48fc6c6339d4c1179574100fa WatchSource:0}: Error finding container cb452367ec3981f84ad475759d5677a6c3c1bff48fc6c6339d4c1179574100fa: Status 404 returned error can't find the container with id cb452367ec3981f84ad475759d5677a6c3c1bff48fc6c6339d4c1179574100fa Apr 16 18:05:00.968289 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.968250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:00.971602 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:00.971556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7365984a-9f2d-436f-bea8-7faf76f34ed0-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-hsnh6\" (UID: \"7365984a-9f2d-436f-bea8-7faf76f34ed0\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:01.157001 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.156959 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" Apr 16 18:05:01.307491 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.307455 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-hsnh6"] Apr 16 18:05:01.310497 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:05:01.310466 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7365984a_9f2d_436f_bea8_7faf76f34ed0.slice/crio-6921ec733121b3ee6adecf98f0bcc26faebaf095070c5bce5be8414a1fc64c32 WatchSource:0}: Error finding container 6921ec733121b3ee6adecf98f0bcc26faebaf095070c5bce5be8414a1fc64c32: Status 404 returned error can't find the container with id 6921ec733121b3ee6adecf98f0bcc26faebaf095070c5bce5be8414a1fc64c32 Apr 16 18:05:01.338775 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.338738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mzpt" event={"ID":"800daf97-3eb8-47d8-abed-a15df1b37ef8","Type":"ContainerStarted","Data":"05e50ba7d8aa7f6c187e0e5cf1e5a5d8c7c39a1974345e4cb612d472d9acb1d0"} Apr 16 18:05:01.338904 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.338790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mzpt" event={"ID":"800daf97-3eb8-47d8-abed-a15df1b37ef8","Type":"ContainerStarted","Data":"cb452367ec3981f84ad475759d5677a6c3c1bff48fc6c6339d4c1179574100fa"} Apr 16 18:05:01.340396 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.340369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" event={"ID":"e58d52da-9882-4f26-9068-6bd896f8e549","Type":"ContainerStarted","Data":"1438e00e427d97a5816c894edb793ca1e074c8433a9e7c96209f56d937ced82c"} Apr 16 
18:05:01.340785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.340753 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:05:01.341384 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.341359 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-59f9dc8ccd-4m9fd" Apr 16 18:05:01.341604 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:01.341586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" event={"ID":"7365984a-9f2d-436f-bea8-7faf76f34ed0","Type":"ContainerStarted","Data":"6921ec733121b3ee6adecf98f0bcc26faebaf095070c5bce5be8414a1fc64c32"} Apr 16 18:05:03.349244 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.349156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7c5c44ff-k652s" event={"ID":"f1a845d1-0931-4603-956f-f21f93dce1a3","Type":"ContainerStarted","Data":"961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710"} Apr 16 18:05:03.351239 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.351206 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" event={"ID":"7365984a-9f2d-436f-bea8-7faf76f34ed0","Type":"ContainerStarted","Data":"9beec89849838ba3d4bc0ef8c542a956a77952ae2e8ddfdc996084e145319a69"} Apr 16 18:05:03.351370 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.351243 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" event={"ID":"7365984a-9f2d-436f-bea8-7faf76f34ed0","Type":"ContainerStarted","Data":"619e51b8e36209197b522f1a6b8117e51abd5e98cb2ad384d8d306603046824a"} Apr 16 18:05:03.351370 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.351259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" event={"ID":"7365984a-9f2d-436f-bea8-7faf76f34ed0","Type":"ContainerStarted","Data":"03cf13f8c95f39871dc79dc26dd12c105be5a09accf0c41e84645ae859109664"} Apr 16 18:05:03.352701 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.352677 2572 generic.go:358] "Generic (PLEG): container finished" podID="800daf97-3eb8-47d8-abed-a15df1b37ef8" containerID="05e50ba7d8aa7f6c187e0e5cf1e5a5d8c7c39a1974345e4cb612d472d9acb1d0" exitCode=0 Apr 16 18:05:03.352845 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.352743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mzpt" event={"ID":"800daf97-3eb8-47d8-abed-a15df1b37ef8","Type":"ContainerDied","Data":"05e50ba7d8aa7f6c187e0e5cf1e5a5d8c7c39a1974345e4cb612d472d9acb1d0"} Apr 16 18:05:03.382405 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:03.382366 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c7c5c44ff-k652s" podStartSLOduration=1.656231456 podStartE2EDuration="4.382354319s" podCreationTimestamp="2026-04-16 18:04:59 +0000 UTC" firstStartedPulling="2026-04-16 18:04:59.653995722 +0000 UTC m=+164.512713020" lastFinishedPulling="2026-04-16 18:05:02.380118584 +0000 UTC m=+167.238835883" observedRunningTime="2026-04-16 18:05:03.381915428 +0000 UTC m=+168.240632747" watchObservedRunningTime="2026-04-16 18:05:03.382354319 +0000 UTC m=+168.241071634" Apr 16 18:05:03.406377 ip-10-0-128-209 kubenswrapper[2572]: I0416 
18:05:03.406337 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-hsnh6" podStartSLOduration=1.68348842 podStartE2EDuration="3.406324896s" podCreationTimestamp="2026-04-16 18:05:00 +0000 UTC" firstStartedPulling="2026-04-16 18:05:01.312625189 +0000 UTC m=+166.171342489" lastFinishedPulling="2026-04-16 18:05:03.035461668 +0000 UTC m=+167.894178965" observedRunningTime="2026-04-16 18:05:03.405837062 +0000 UTC m=+168.264554378" watchObservedRunningTime="2026-04-16 18:05:03.406324896 +0000 UTC m=+168.265042213" Apr 16 18:05:04.358483 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.358406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mzpt" event={"ID":"800daf97-3eb8-47d8-abed-a15df1b37ef8","Type":"ContainerStarted","Data":"a0443055375d46ce03e999c34e50cd1097c6df98852010b4bf2330fee5c6ff6f"} Apr 16 18:05:04.358483 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.358453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mzpt" event={"ID":"800daf97-3eb8-47d8-abed-a15df1b37ef8","Type":"ContainerStarted","Data":"f1da0d529120aaa3016a20b7b289d1404b80b7036748ce7cfe705f761f3b3d10"} Apr 16 18:05:04.382446 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.382402 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4mzpt" podStartSLOduration=3.761720143 podStartE2EDuration="4.382389402s" podCreationTimestamp="2026-04-16 18:05:00 +0000 UTC" firstStartedPulling="2026-04-16 18:05:00.578980484 +0000 UTC m=+165.437697793" lastFinishedPulling="2026-04-16 18:05:01.19964975 +0000 UTC m=+166.058367052" observedRunningTime="2026-04-16 18:05:04.381712669 +0000 UTC m=+169.240429985" watchObservedRunningTime="2026-04-16 18:05:04.382389402 +0000 UTC m=+169.241106718" Apr 16 18:05:04.739247 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.739193 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqct2" Apr 16 18:05:04.739247 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.739209 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:05:04.742266 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.742247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-p7xs6\"" Apr 16 18:05:04.749651 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.749637 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqct2" Apr 16 18:05:04.858067 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:04.858045 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kqct2"] Apr 16 18:05:04.860366 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:05:04.860337 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b0f748_a8be_4032_a386_74c3dc7ad240.slice/crio-2b50b60fd7d03216ebe02c52b8e1ed2f2a3f395c0aadd7b9a986bd6e7fef9fe0 WatchSource:0}: Error finding container 2b50b60fd7d03216ebe02c52b8e1ed2f2a3f395c0aadd7b9a986bd6e7fef9fe0: Status 404 returned error can't find the container with id 2b50b60fd7d03216ebe02c52b8e1ed2f2a3f395c0aadd7b9a986bd6e7fef9fe0 Apr 16 18:05:05.361777 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:05.361741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kqct2" event={"ID":"f8b0f748-a8be-4032-a386-74c3dc7ad240","Type":"ContainerStarted","Data":"2b50b60fd7d03216ebe02c52b8e1ed2f2a3f395c0aadd7b9a986bd6e7fef9fe0"} Apr 16 18:05:06.365886 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:06.365781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kqct2" event={"ID":"f8b0f748-a8be-4032-a386-74c3dc7ad240","Type":"ContainerStarted","Data":"d4d38bc9d807b099f3948649b17f6c6416bbb2a9d3636317cd66a99db5a65b9b"} Apr 16 18:05:06.382950 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:06.382905 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kqct2" podStartSLOduration=138.004213015 podStartE2EDuration="2m19.382891236s" podCreationTimestamp="2026-04-16 18:02:47 +0000 UTC" firstStartedPulling="2026-04-16 18:05:04.862230111 +0000 UTC m=+169.720947406" lastFinishedPulling="2026-04-16 18:05:06.240908331 +0000 UTC m=+171.099625627" observedRunningTime="2026-04-16 18:05:06.381722244 +0000 UTC m=+171.240439562" watchObservedRunningTime="2026-04-16 18:05:06.382891236 +0000 UTC m=+171.241608552" Apr 16 18:05:07.330104 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:07.330077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wmkzd" Apr 16 18:05:08.812260 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:08.812216 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c7c5c44ff-k652s"] Apr 16 18:05:09.500166 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:09.500139 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:05:33.830334 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:33.830277 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c7c5c44ff-k652s" podUID="f1a845d1-0931-4603-956f-f21f93dce1a3" containerName="console" containerID="cri-o://961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710" gracePeriod=15 Apr 16 18:05:34.044869 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.044835 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" podUID="14ba27c3-91a6-4157-8692-9b6a6e505b65" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:05:34.063106 ip-10-0-128-209 
kubenswrapper[2572]: I0416 18:05:34.063085 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c7c5c44ff-k652s_f1a845d1-0931-4603-956f-f21f93dce1a3/console/0.log" Apr 16 18:05:34.063203 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.063141 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:05:34.097256 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097194 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9skdj\" (UniqueName: \"kubernetes.io/projected/f1a845d1-0931-4603-956f-f21f93dce1a3-kube-api-access-9skdj\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097256 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097230 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-serving-cert\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097265 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-console-config\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097314 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-oauth-serving-cert\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097342 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-trusted-ca-bundle\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097385 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-oauth-config\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097417 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-service-ca\") pod \"f1a845d1-0931-4603-956f-f21f93dce1a3\" (UID: \"f1a845d1-0931-4603-956f-f21f93dce1a3\") " Apr 16 18:05:34.097785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.097752 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-console-config" (OuterVolumeSpecName: "console-config") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:34.098286 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.098205 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:34.098286 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.098239 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:34.098286 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.098250 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-service-ca" (OuterVolumeSpecName: "service-ca") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:34.099992 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.099954 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:34.100110 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.100090 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:34.100165 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.100131 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a845d1-0931-4603-956f-f21f93dce1a3-kube-api-access-9skdj" (OuterVolumeSpecName: "kube-api-access-9skdj") pod "f1a845d1-0931-4603-956f-f21f93dce1a3" (UID: "f1a845d1-0931-4603-956f-f21f93dce1a3"). InnerVolumeSpecName "kube-api-access-9skdj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:34.198606 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198582 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9skdj\" (UniqueName: \"kubernetes.io/projected/f1a845d1-0931-4603-956f-f21f93dce1a3-kube-api-access-9skdj\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.198606 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198602 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-serving-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.198730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198612 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-console-config\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.198730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198622 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-oauth-serving-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.198730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198641 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-trusted-ca-bundle\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.198730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198650 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f1a845d1-0931-4603-956f-f21f93dce1a3-console-oauth-config\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.198730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.198659 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f1a845d1-0931-4603-956f-f21f93dce1a3-service-ca\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.438407 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.438384 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c7c5c44ff-k652s_f1a845d1-0931-4603-956f-f21f93dce1a3/console/0.log" Apr 16 18:05:34.438508 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.438424 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1a845d1-0931-4603-956f-f21f93dce1a3" containerID="961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710" exitCode=2 Apr 16 18:05:34.438508 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.438480 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c7c5c44ff-k652s" Apr 16 18:05:34.438616 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.438480 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7c5c44ff-k652s" event={"ID":"f1a845d1-0931-4603-956f-f21f93dce1a3","Type":"ContainerDied","Data":"961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710"} Apr 16 18:05:34.438616 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.438587 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7c5c44ff-k652s" event={"ID":"f1a845d1-0931-4603-956f-f21f93dce1a3","Type":"ContainerDied","Data":"1517514b30a149d79179db6463cf675a3f3547228155fa4e6d6a94d1710f5561"} Apr 16 18:05:34.438616 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.438608 2572 scope.go:117] "RemoveContainer" containerID="961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710" Apr 16 18:05:34.446276 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.446254 2572 scope.go:117] "RemoveContainer" containerID="961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710" Apr 16 18:05:34.446570 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:05:34.446544 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710\": container with ID starting with 961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710 not found: ID does not exist" containerID="961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710" Apr 16 18:05:34.446686 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.446581 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710"} err="failed to get container status \"961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710\": rpc error: code = NotFound desc = could not find container \"961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710\": container with ID starting with 961ef8627e8335bd8d9ade4313472f574111ac8e099d8f675748ecb62402f710 not found: ID does not exist" Apr 16 18:05:34.461199 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.461178 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c7c5c44ff-k652s"] Apr 16 18:05:34.465487 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:34.465468 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c7c5c44ff-k652s"] Apr 16 18:05:35.742702 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:35.742674 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a845d1-0931-4603-956f-f21f93dce1a3" path="/var/lib/kubelet/pods/f1a845d1-0931-4603-956f-f21f93dce1a3/volumes" Apr 16 18:05:44.044438 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:44.044398 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" podUID="14ba27c3-91a6-4157-8692-9b6a6e505b65" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:05:54.044615 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.044573 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" podUID="14ba27c3-91a6-4157-8692-9b6a6e505b65" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:05:54.044953 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.044661 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" Apr 16 18:05:54.045142 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.045124 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"c491dec4247a9d68b95b560e36fbc6a0b7c1ade4e8d83ed4ef352c14c77da937"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:05:54.045178 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.045160 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" podUID="14ba27c3-91a6-4157-8692-9b6a6e505b65" containerName="service-proxy" containerID="cri-o://c491dec4247a9d68b95b560e36fbc6a0b7c1ade4e8d83ed4ef352c14c77da937" gracePeriod=30 Apr 16 18:05:54.489743 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.489710 2572 generic.go:358] "Generic (PLEG): container finished" podID="14ba27c3-91a6-4157-8692-9b6a6e505b65" containerID="c491dec4247a9d68b95b560e36fbc6a0b7c1ade4e8d83ed4ef352c14c77da937" exitCode=2 Apr 16 18:05:54.489904 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.489778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" event={"ID":"14ba27c3-91a6-4157-8692-9b6a6e505b65","Type":"ContainerDied","Data":"c491dec4247a9d68b95b560e36fbc6a0b7c1ade4e8d83ed4ef352c14c77da937"} Apr 16 18:05:54.489904 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:05:54.489813 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c46c4c674-99b4q" event={"ID":"14ba27c3-91a6-4157-8692-9b6a6e505b65","Type":"ContainerStarted","Data":"45bbd2da6ce2363e2a0fd88653a643cfd399744ec1b0c082958abf155622d6a6"} Apr 16 18:06:26.459285 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:26.459247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:06:26.461479 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:26.461458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c4e7715-635e-4cb8-b891-8d2f74e1ef9c-metrics-certs\") pod \"network-metrics-daemon-2k4qz\" (UID: \"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c\") " pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:06:26.643769 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:26.643740 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2dnp6\"" Apr 16 18:06:26.651490 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:26.651471 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2k4qz" Apr 16 18:06:26.780793 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:26.780762 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2k4qz"] Apr 16 18:06:26.783913 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:06:26.783888 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4e7715_635e_4cb8_b891_8d2f74e1ef9c.slice/crio-1ca32b7622f46e0df7d6cad10f661f8ac4d2ba6173c9993ba8f3959c432f3585 WatchSource:0}: Error finding container 1ca32b7622f46e0df7d6cad10f661f8ac4d2ba6173c9993ba8f3959c432f3585: Status 404 returned error can't find the container with id 1ca32b7622f46e0df7d6cad10f661f8ac4d2ba6173c9993ba8f3959c432f3585 Apr 16 18:06:27.568873 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:27.568835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2k4qz" event={"ID":"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c","Type":"ContainerStarted","Data":"1ca32b7622f46e0df7d6cad10f661f8ac4d2ba6173c9993ba8f3959c432f3585"} Apr 16 18:06:28.547095 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.547064 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57fb785d4-tl58p"] Apr 16 18:06:28.547295 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.547284 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a845d1-0931-4603-956f-f21f93dce1a3" containerName="console" Apr 16 18:06:28.547359 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.547296 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a845d1-0931-4603-956f-f21f93dce1a3" containerName="console" Apr 16 18:06:28.547359 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.547333 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a845d1-0931-4603-956f-f21f93dce1a3" containerName="console" Apr 16 18:06:28.549924 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.549908 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.552680 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.552660 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:06:28.552778 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.552700 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-87wwb\"" Apr 16 18:06:28.553782 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.553765 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:06:28.553882 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.553850 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:06:28.553882 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.553856 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:06:28.554806 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.554567 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:06:28.555476 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.555457 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:06:28.555592 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.555570 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:06:28.561485 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.561464 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57fb785d4-tl58p"] Apr 16 18:06:28.562601 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.562585 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:06:28.572318 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.572300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2k4qz" event={"ID":"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c","Type":"ContainerStarted","Data":"5e15ba43a2a840f319e68eb5de0c0419f0e3f8a5f5cec42eeb9a635637e68345"} Apr 16 18:06:28.572591 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.572322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2k4qz" event={"ID":"5c4e7715-635e-4cb8-b891-8d2f74e1ef9c","Type":"ContainerStarted","Data":"e7f2435fd069528c512681065f05f5296a624bb823988bdad8a755f83b49e08a"} Apr 16 18:06:28.593743 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.593706 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2k4qz" podStartSLOduration=252.731581486 podStartE2EDuration="4m13.593694723s" podCreationTimestamp="2026-04-16 18:02:15 +0000 UTC" firstStartedPulling="2026-04-16 18:06:26.785530034 +0000 UTC m=+251.644247329" lastFinishedPulling="2026-04-16 18:06:27.647643256 +0000 UTC m=+252.506360566" observedRunningTime="2026-04-16 18:06:28.592460916 +0000 UTC m=+253.451178232" watchObservedRunningTime="2026-04-16 18:06:28.593694723 +0000 UTC m=+253.452412039" Apr 16 18:06:28.671012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.670989 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-trusted-ca-bundle\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.671102 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.671017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skg64\" (UniqueName: \"kubernetes.io/projected/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-kube-api-access-skg64\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.671102 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.671067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-service-ca\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.671175 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.671166 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-oauth-serving-cert\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.671218 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.671185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-oauth-config\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.671218 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.671205 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-config\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.671295 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.671230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-serving-cert\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771450 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-serving-cert\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771545 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-trusted-ca-bundle\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771545 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skg64\" (UniqueName: \"kubernetes.io/projected/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-kube-api-access-skg64\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771545 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-service-ca\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771659 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-oauth-serving-cert\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771659 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-oauth-config\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.771659 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.771604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-config\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.772254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.772227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-service-ca\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.772340 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.772279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-config\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.772340 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.772320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-oauth-serving-cert\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.772414 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.772402 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-trusted-ca-bundle\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.773809 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.773785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-serving-cert\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.773950 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.773928 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-oauth-config\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.780095 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.780076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skg64\" (UniqueName: \"kubernetes.io/projected/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-kube-api-access-skg64\") pod \"console-57fb785d4-tl58p\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.862318 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.862299 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:28.969735 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:28.969598 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57fb785d4-tl58p"] Apr 16 18:06:28.972001 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:06:28.971972 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa18bdbb_fbf2_473e_8e10_d96c9cef5f7d.slice/crio-4f2e408be79f8ee07ae19485d7c5fe338c22000f5fe3a415cd59bf02007db18e WatchSource:0}: Error finding container 4f2e408be79f8ee07ae19485d7c5fe338c22000f5fe3a415cd59bf02007db18e: Status 404 returned error can't find the container with id 4f2e408be79f8ee07ae19485d7c5fe338c22000f5fe3a415cd59bf02007db18e Apr 16 18:06:29.576158 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:29.576125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57fb785d4-tl58p" event={"ID":"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d","Type":"ContainerStarted","Data":"7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e"} Apr 16 18:06:29.576158 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:29.576162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57fb785d4-tl58p" event={"ID":"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d","Type":"ContainerStarted","Data":"4f2e408be79f8ee07ae19485d7c5fe338c22000f5fe3a415cd59bf02007db18e"} Apr 16 18:06:29.595164 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:29.595125 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57fb785d4-tl58p" podStartSLOduration=1.5951106689999999 podStartE2EDuration="1.595110669s" podCreationTimestamp="2026-04-16 18:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:29.593706232 +0000 UTC m=+254.452423559" watchObservedRunningTime="2026-04-16 18:06:29.595110669 +0000 UTC m=+254.453827984" Apr 16 18:06:38.862578 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:38.862490 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:38.862974 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:38.862585 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:38.868384 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:38.868362 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:06:39.606950 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:06:39.606923 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:07:15.625450 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:15.625421 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:07:15.626274 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:15.626251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:07:15.631890 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:15.631874 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:07:36.268886 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.268839 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c58ff8f8c-n7nzh"] Apr 16 18:07:36.271148 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.271126 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.293270 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.293245 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c58ff8f8c-n7nzh"] Apr 16 18:07:36.328745 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-oauth-config\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.328876 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-service-ca\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.328876 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-oauth-serving-cert\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.328876 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-config\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.329018 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpgd4\" (UniqueName: \"kubernetes.io/projected/4f2d32e9-f079-4a8a-b5e7-fb7921756978-kube-api-access-xpgd4\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.329018 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-trusted-ca-bundle\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.329018 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.328945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-serving-cert\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.429577 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.429545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-oauth-config\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.429738 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.429584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-service-ca\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.429738 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.429609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-oauth-serving-cert\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.429738 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.429640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-config\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.429738 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.429666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpgd4\" (UniqueName: \"kubernetes.io/projected/4f2d32e9-f079-4a8a-b5e7-fb7921756978-kube-api-access-xpgd4\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.429738 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.429689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-trusted-ca-bundle\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.430232 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.430201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-serving-cert\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.430422 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.430401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-service-ca\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.430504 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.430401 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-config\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.430504 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.430439 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-oauth-serving-cert\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.430504 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.430480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-trusted-ca-bundle\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.432553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.432532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-serving-cert\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.432625 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.432551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-oauth-config\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.438440 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.438417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpgd4\" (UniqueName: \"kubernetes.io/projected/4f2d32e9-f079-4a8a-b5e7-fb7921756978-kube-api-access-xpgd4\") pod \"console-c58ff8f8c-n7nzh\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.580791 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.580696 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:36.695658 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.695624 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c58ff8f8c-n7nzh"] Apr 16 18:07:36.698309 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:07:36.698283 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2d32e9_f079_4a8a_b5e7_fb7921756978.slice/crio-914ef06ac9b27ba09bdd53d52db40b04fbed983703c0d8ddeef9946d3e1fd5cd WatchSource:0}: Error finding container 914ef06ac9b27ba09bdd53d52db40b04fbed983703c0d8ddeef9946d3e1fd5cd: Status 404 returned error can't find the container with id 914ef06ac9b27ba09bdd53d52db40b04fbed983703c0d8ddeef9946d3e1fd5cd Apr 16 18:07:36.699982 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.699968 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:07:36.749098 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:36.749067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c58ff8f8c-n7nzh" event={"ID":"4f2d32e9-f079-4a8a-b5e7-fb7921756978","Type":"ContainerStarted","Data":"914ef06ac9b27ba09bdd53d52db40b04fbed983703c0d8ddeef9946d3e1fd5cd"} Apr 16 18:07:37.752555 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:37.752507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c58ff8f8c-n7nzh" event={"ID":"4f2d32e9-f079-4a8a-b5e7-fb7921756978","Type":"ContainerStarted","Data":"e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302"} Apr 16 18:07:37.771577 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:37.771534 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c58ff8f8c-n7nzh" podStartSLOduration=1.771498345 podStartE2EDuration="1.771498345s" podCreationTimestamp="2026-04-16 18:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:07:37.77065424 +0000 UTC m=+322.629371557" watchObservedRunningTime="2026-04-16 18:07:37.771498345 +0000 UTC m=+322.630215660" Apr 16 18:07:46.581047 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:46.581008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:46.581422 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:46.581092 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:46.585549 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:46.585509 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:46.777650 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:46.777621 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:07:46.831333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:07:46.831254 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57fb785d4-tl58p"] Apr 16 18:08:00.221072 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.221039 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-85dqd"] Apr 16 18:08:00.223333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.223312 2572 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.226366 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.226346 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:08:00.232398 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.232374 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-85dqd"] Apr 16 18:08:00.287989 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.287953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/316daf04-aff0-4816-9b83-eae46b1fd37b-dbus\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.288139 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.287999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/316daf04-aff0-4816-9b83-eae46b1fd37b-original-pull-secret\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.288139 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.288079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/316daf04-aff0-4816-9b83-eae46b1fd37b-kubelet-config\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.389233 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.389202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/316daf04-aff0-4816-9b83-eae46b1fd37b-dbus\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.389366 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.389246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/316daf04-aff0-4816-9b83-eae46b1fd37b-original-pull-secret\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.389366 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.389285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/316daf04-aff0-4816-9b83-eae46b1fd37b-kubelet-config\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.389475 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.389385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/316daf04-aff0-4816-9b83-eae46b1fd37b-kubelet-config\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.389475 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.389389 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/316daf04-aff0-4816-9b83-eae46b1fd37b-dbus\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.391451 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.391425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/316daf04-aff0-4816-9b83-eae46b1fd37b-original-pull-secret\") pod \"global-pull-secret-syncer-85dqd\" (UID: \"316daf04-aff0-4816-9b83-eae46b1fd37b\") " pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.533768 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.533694 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-85dqd" Apr 16 18:08:00.649147 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.649115 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-85dqd"] Apr 16 18:08:00.651936 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:08:00.651907 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316daf04_aff0_4816_9b83_eae46b1fd37b.slice/crio-e6c40ced4d13e78137592d3cddc7851220ffc7412daa64e5767b995f6bb35de5 WatchSource:0}: Error finding container e6c40ced4d13e78137592d3cddc7851220ffc7412daa64e5767b995f6bb35de5: Status 404 returned error can't find the container with id e6c40ced4d13e78137592d3cddc7851220ffc7412daa64e5767b995f6bb35de5 Apr 16 18:08:00.810644 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:00.810572 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-85dqd" event={"ID":"316daf04-aff0-4816-9b83-eae46b1fd37b","Type":"ContainerStarted","Data":"e6c40ced4d13e78137592d3cddc7851220ffc7412daa64e5767b995f6bb35de5"} Apr 16 18:08:06.826742 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:06.826707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-85dqd" event={"ID":"316daf04-aff0-4816-9b83-eae46b1fd37b","Type":"ContainerStarted","Data":"57fbfc844f318402f3f482c88cc6189b845fb9c9ea3a66bb44c01b6c80a5ba58"} Apr 16 18:08:06.842860 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:06.842816 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-85dqd" podStartSLOduration=1.465759609 podStartE2EDuration="6.842801918s" podCreationTimestamp="2026-04-16 18:08:00 +0000 UTC" firstStartedPulling="2026-04-16 18:08:00.653394209 +0000 UTC m=+345.512111504" lastFinishedPulling="2026-04-16 18:08:06.030436515 +0000 UTC m=+350.889153813" observedRunningTime="2026-04-16 18:08:06.841847651 +0000 UTC m=+351.700564968" watchObservedRunningTime="2026-04-16 18:08:06.842801918 +0000 UTC m=+351.701519279" Apr 16 18:08:11.850389 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:11.850330 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57fb785d4-tl58p" podUID="fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" containerName="console" containerID="cri-o://7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e" gracePeriod=15 Apr 16 18:08:12.079709 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.079681 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-57fb785d4-tl58p_fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d/console/0.log" Apr 16 18:08:12.079814 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.079751 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:08:12.179189 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179157 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-service-ca\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179343 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179201 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-trusted-ca-bundle\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179343 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179229 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skg64\" (UniqueName: \"kubernetes.io/projected/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-kube-api-access-skg64\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179343 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179282 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-oauth-config\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179343 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179309 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-config\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179348 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-oauth-serving-cert\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179377 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-serving-cert\") pod \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\" (UID: \"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d\") " Apr 16 18:08:12.179657 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179563 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-service-ca" (OuterVolumeSpecName: "service-ca") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:08:12.179842 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179815 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-config" (OuterVolumeSpecName: "console-config") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:08:12.179971 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179852 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:08:12.179971 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.179864 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:08:12.181564 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.181539 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:08:12.181632 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.181583 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-kube-api-access-skg64" (OuterVolumeSpecName: "kube-api-access-skg64") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "kube-api-access-skg64". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:12.181737 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.181715 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" (UID: "fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:08:12.280032 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.279997 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skg64\" (UniqueName: \"kubernetes.io/projected/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-kube-api-access-skg64\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.280032 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.280026 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-oauth-config\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.280032 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.280036 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-config\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.280233 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.280045 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-oauth-serving-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.280233 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.280054 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-console-serving-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.280233 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.280062 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-service-ca\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.280233 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.280071 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d-trusted-ca-bundle\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:08:12.843035 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.843009 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57fb785d4-tl58p_fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d/console/0.log" Apr 16 18:08:12.843210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.843047 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" containerID="7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e" exitCode=2 Apr 16 18:08:12.843210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.843098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57fb785d4-tl58p" event={"ID":"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d","Type":"ContainerDied","Data":"7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e"} Apr 16 18:08:12.843210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.843110 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57fb785d4-tl58p" Apr 16 18:08:12.843210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.843120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57fb785d4-tl58p" event={"ID":"fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d","Type":"ContainerDied","Data":"4f2e408be79f8ee07ae19485d7c5fe338c22000f5fe3a415cd59bf02007db18e"} Apr 16 18:08:12.843210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.843133 2572 scope.go:117] "RemoveContainer" containerID="7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e" Apr 16 18:08:12.851333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.851127 2572 scope.go:117] "RemoveContainer" containerID="7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e" Apr 16 18:08:12.851573 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:12.851387 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e\": container with ID starting with 7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e not found: ID does not exist" containerID="7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e" Apr 16 18:08:12.851573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.851412 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e"} err="failed to get container status \"7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e\": rpc error: code = NotFound desc = could not find container \"7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e\": container with ID starting with 7b8b291ff662584e32099fc66b487ba8a0ef1741623f3ae78b55e7c090e6724e not found: ID does not exist" Apr 16 18:08:12.863393 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.863373 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57fb785d4-tl58p"] Apr 16 18:08:12.866633 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:12.866606 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57fb785d4-tl58p"] Apr 16 18:08:13.742693 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:13.742663 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" path="/var/lib/kubelet/pods/fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d/volumes" Apr 16 18:08:45.726057 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.726019 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx"] Apr 16 18:08:45.726573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.726303 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" containerName="console" Apr 16 18:08:45.726573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.726316 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" containerName="console" Apr 16 18:08:45.726573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.726363 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa18bdbb-fbf2-473e-8e10-d96c9cef5f7d" containerName="console" Apr 16 18:08:45.728082 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.728067 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.731425 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.731392 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-dlvgn\"" Apr 16 18:08:45.731650 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.731633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:08:45.732654 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.732632 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:08:45.732756 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.732685 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:08:45.732756 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.732723 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:08:45.732756 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.732727 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:08:45.743255 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.743230 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx"] Apr 16 18:08:45.818139 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.818099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4p9q\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-kube-api-access-j4p9q\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.818139 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.818142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.818345 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.818258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.919372 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.919324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.919372 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.919373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4p9q\" (UniqueName: 
\"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-kube-api-access-j4p9q\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.919614 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.919393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.919614 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:45.919470 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:08:45.919614 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:45.919492 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:08:45.919614 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:45.919507 2572 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 18:08:45.919614 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:45.919551 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 18:08:45.919782 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:45.919631 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates podName:64ddc4a6-ee9d-42bb-8f2c-589768f5b155 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:46.4196151 +0000 UTC m=+391.278332400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates") pod "keda-metrics-apiserver-7c9f485588-c72vx" (UID: "64ddc4a6-ee9d-42bb-8f2c-589768f5b155") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 18:08:45.919782 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.919739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:45.936976 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:45.936941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4p9q\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-kube-api-access-j4p9q\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:46.078499 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.078422 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-75krx"] Apr 16 18:08:46.080532 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.080490 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.083372 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.083344 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:08:46.090926 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.090904 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-75krx"] Apr 16 18:08:46.222890 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.222856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-certificates\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.223058 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.222932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fg7x\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-kube-api-access-4fg7x\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.323638 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.323608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fg7x\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-kube-api-access-4fg7x\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.323792 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.323652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-certificates\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.323792 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.323761 2572 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 18:08:46.323792 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.323779 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-75krx: secret "keda-admission-webhooks-certs" not found Apr 16 18:08:46.323885 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.323836 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-certificates podName:e47c388b-e249-45f4-9ab1-68fece5fe9b6 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:46.823822826 +0000 UTC m=+391.682540122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-certificates") pod "keda-admission-cf49989db-75krx" (UID: "e47c388b-e249-45f4-9ab1-68fece5fe9b6") : secret "keda-admission-webhooks-certs" not found Apr 16 18:08:46.332599 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.332541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fg7x\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-kube-api-access-4fg7x\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.424313 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.424293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:46.424409 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.424383 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:08:46.424409 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.424394 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:08:46.424409 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.424408 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx: references non-existent secret key: tls.crt Apr 16 18:08:46.424499 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:46.424446 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates podName:64ddc4a6-ee9d-42bb-8f2c-589768f5b155 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:47.424435352 +0000 UTC m=+392.283152648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates") pod "keda-metrics-apiserver-7c9f485588-c72vx" (UID: "64ddc4a6-ee9d-42bb-8f2c-589768f5b155") : references non-existent secret key: tls.crt Apr 16 18:08:46.827356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.827315 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-certificates\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.829650 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.829631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e47c388b-e249-45f4-9ab1-68fece5fe9b6-certificates\") pod \"keda-admission-cf49989db-75krx\" (UID: \"e47c388b-e249-45f4-9ab1-68fece5fe9b6\") " pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:46.991438 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:46.991404 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:47.109080 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:47.109053 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-75krx"] Apr 16 18:08:47.112191 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:08:47.112166 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode47c388b_e249_45f4_9ab1_68fece5fe9b6.slice/crio-fef4f3e4a8a0ea3de4a31fb3549f4df511992948a5c9c18c4d4196c7c490e69c WatchSource:0}: Error finding container fef4f3e4a8a0ea3de4a31fb3549f4df511992948a5c9c18c4d4196c7c490e69c: Status 404 returned error can't find the container with id fef4f3e4a8a0ea3de4a31fb3549f4df511992948a5c9c18c4d4196c7c490e69c Apr 16 18:08:47.432103 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:47.432013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:47.432256 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:47.432149 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:08:47.432256 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:47.432162 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:08:47.432256 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:47.432179 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx: references non-existent secret key: tls.crt Apr 16 18:08:47.432256 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:47.432236 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates podName:64ddc4a6-ee9d-42bb-8f2c-589768f5b155 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:49.432223136 +0000 UTC m=+394.290940432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates") pod "keda-metrics-apiserver-7c9f485588-c72vx" (UID: "64ddc4a6-ee9d-42bb-8f2c-589768f5b155") : references non-existent secret key: tls.crt Apr 16 18:08:47.933850 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:47.933815 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-75krx" event={"ID":"e47c388b-e249-45f4-9ab1-68fece5fe9b6","Type":"ContainerStarted","Data":"fef4f3e4a8a0ea3de4a31fb3549f4df511992948a5c9c18c4d4196c7c490e69c"} Apr 16 18:08:49.450245 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:49.450218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:49.450599 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:49.450372 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:08:49.450599 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:49.450394 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:08:49.450599 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:49.450416 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx: references non-existent secret key: tls.crt Apr 16 18:08:49.450599 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:08:49.450476 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates podName:64ddc4a6-ee9d-42bb-8f2c-589768f5b155 nodeName:}" failed. No retries permitted until 2026-04-16 18:08:53.450458813 +0000 UTC m=+398.309176125 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates") pod "keda-metrics-apiserver-7c9f485588-c72vx" (UID: "64ddc4a6-ee9d-42bb-8f2c-589768f5b155") : references non-existent secret key: tls.crt Apr 16 18:08:49.940530 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:49.940485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-75krx" event={"ID":"e47c388b-e249-45f4-9ab1-68fece5fe9b6","Type":"ContainerStarted","Data":"626b60fc83acb1411ed3e3d8a81e7d98b3cbcbbf9188373a82f3b0864d561fd9"} Apr 16 18:08:49.958262 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:49.958223 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-75krx" podStartSLOduration=1.752439302 podStartE2EDuration="3.958210783s" podCreationTimestamp="2026-04-16 18:08:46 +0000 UTC" firstStartedPulling="2026-04-16 18:08:47.113374033 +0000 UTC m=+391.972091328" lastFinishedPulling="2026-04-16 18:08:49.319145513 +0000 UTC m=+394.177862809" observedRunningTime="2026-04-16 18:08:49.957326637 +0000 UTC m=+394.816043955" watchObservedRunningTime="2026-04-16 18:08:49.958210783 +0000 UTC m=+394.816928099" Apr 16 18:08:50.943331 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:50.943291 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:08:53.483949 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:53.483899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:53.486364 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:53.486339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64ddc4a6-ee9d-42bb-8f2c-589768f5b155-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c72vx\" (UID: \"64ddc4a6-ee9d-42bb-8f2c-589768f5b155\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:53.538294 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:53.538259 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:53.653338 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:53.653309 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx"] Apr 16 18:08:53.655690 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:08:53.655667 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ddc4a6_ee9d_42bb_8f2c_589768f5b155.slice/crio-a9b84b46642d2d132648f550ffe8d22246829b19556f6e38909501d70c0833b8 WatchSource:0}: Error finding container a9b84b46642d2d132648f550ffe8d22246829b19556f6e38909501d70c0833b8: Status 404 returned error can't find the container with id a9b84b46642d2d132648f550ffe8d22246829b19556f6e38909501d70c0833b8 Apr 16 18:08:53.952328 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:53.952290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" event={"ID":"64ddc4a6-ee9d-42bb-8f2c-589768f5b155","Type":"ContainerStarted","Data":"a9b84b46642d2d132648f550ffe8d22246829b19556f6e38909501d70c0833b8"} Apr 16 18:08:55.959583 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:55.959548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" event={"ID":"64ddc4a6-ee9d-42bb-8f2c-589768f5b155","Type":"ContainerStarted","Data":"c40448f88b06f871554495e434a6531933c1c0018fc4d130a7a7e23b7c6045f4"} Apr 16 18:08:55.959997 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:55.959663 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:08:55.978861 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:08:55.978812 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" podStartSLOduration=8.790116439 podStartE2EDuration="10.97879352s" podCreationTimestamp="2026-04-16 18:08:45 +0000 UTC" firstStartedPulling="2026-04-16 18:08:53.65698306 +0000 UTC m=+398.515700355" lastFinishedPulling="2026-04-16 18:08:55.845660135 +0000 UTC m=+400.704377436" observedRunningTime="2026-04-16 18:08:55.977837427 +0000 UTC m=+400.836554743" watchObservedRunningTime="2026-04-16 18:08:55.97879352 +0000 UTC m=+400.837510838" Apr 16 18:09:06.966625 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:06.966597 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c72vx" Apr 16 18:09:11.947910 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:11.947874 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-75krx" Apr 16 18:09:53.287230 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.287202 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-mt87x"] Apr 16 18:09:53.290365 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.290349 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.293036 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.293003 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:09:53.294069 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.294039 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:09:53.294161 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.294067 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:09:53.294161 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.294101 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-xwf82\"" Apr 16 18:09:53.301845 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.301827 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-mt87x"] Apr 16 18:09:53.318556 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.318508 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-sq6jw"] Apr 16 18:09:53.321445 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.321428 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.324381 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.324361 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:09:53.324472 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.324450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vjpfs\"" Apr 16 18:09:53.331824 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.331807 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sq6jw"] Apr 16 18:09:53.376348 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.376323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e40c940c-a76b-46f0-9dd0-d0d9e342d64a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mt87x\" (UID: \"e40c940c-a76b-46f0-9dd0-d0d9e342d64a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.376468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.376357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2773cd95-b80a-4fff-aae0-64c92391563b-data\") pod \"seaweedfs-86cc847c5c-sq6jw\" (UID: \"2773cd95-b80a-4fff-aae0-64c92391563b\") " pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.376468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.376381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2dn\" (UniqueName: \"kubernetes.io/projected/2773cd95-b80a-4fff-aae0-64c92391563b-kube-api-access-rk2dn\") pod \"seaweedfs-86cc847c5c-sq6jw\" (UID: \"2773cd95-b80a-4fff-aae0-64c92391563b\") " pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.376468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.376416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7j7g\" (UniqueName: 
\"kubernetes.io/projected/e40c940c-a76b-46f0-9dd0-d0d9e342d64a-kube-api-access-j7j7g\") pod \"llmisvc-controller-manager-68cc5db7c4-mt87x\" (UID: \"e40c940c-a76b-46f0-9dd0-d0d9e342d64a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.477412 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.477386 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7j7g\" (UniqueName: \"kubernetes.io/projected/e40c940c-a76b-46f0-9dd0-d0d9e342d64a-kube-api-access-j7j7g\") pod \"llmisvc-controller-manager-68cc5db7c4-mt87x\" (UID: \"e40c940c-a76b-46f0-9dd0-d0d9e342d64a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.477592 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.477433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e40c940c-a76b-46f0-9dd0-d0d9e342d64a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mt87x\" (UID: \"e40c940c-a76b-46f0-9dd0-d0d9e342d64a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.477592 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.477471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2773cd95-b80a-4fff-aae0-64c92391563b-data\") pod \"seaweedfs-86cc847c5c-sq6jw\" (UID: \"2773cd95-b80a-4fff-aae0-64c92391563b\") " pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.477592 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.477508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2dn\" (UniqueName: \"kubernetes.io/projected/2773cd95-b80a-4fff-aae0-64c92391563b-kube-api-access-rk2dn\") pod \"seaweedfs-86cc847c5c-sq6jw\" (UID: \"2773cd95-b80a-4fff-aae0-64c92391563b\") " pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.477886 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.477866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2773cd95-b80a-4fff-aae0-64c92391563b-data\") pod \"seaweedfs-86cc847c5c-sq6jw\" (UID: \"2773cd95-b80a-4fff-aae0-64c92391563b\") " pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.479809 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.479786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e40c940c-a76b-46f0-9dd0-d0d9e342d64a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-mt87x\" (UID: \"e40c940c-a76b-46f0-9dd0-d0d9e342d64a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.486129 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.486103 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2dn\" (UniqueName: \"kubernetes.io/projected/2773cd95-b80a-4fff-aae0-64c92391563b-kube-api-access-rk2dn\") pod \"seaweedfs-86cc847c5c-sq6jw\" (UID: \"2773cd95-b80a-4fff-aae0-64c92391563b\") " pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.486285 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.486267 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7j7g\" (UniqueName: \"kubernetes.io/projected/e40c940c-a76b-46f0-9dd0-d0d9e342d64a-kube-api-access-j7j7g\") pod \"llmisvc-controller-manager-68cc5db7c4-mt87x\" (UID: \"e40c940c-a76b-46f0-9dd0-d0d9e342d64a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.600295 
ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.600223 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:53.629643 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.629618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:53.717836 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.717803 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-mt87x"] Apr 16 18:09:53.721011 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:09:53.720971 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode40c940c_a76b_46f0_9dd0_d0d9e342d64a.slice/crio-89e3b32baca787eea6cbb0a033f6afc9aacca195d622bbe28451221c95b2379b WatchSource:0}: Error finding container 89e3b32baca787eea6cbb0a033f6afc9aacca195d622bbe28451221c95b2379b: Status 404 returned error can't find the container with id 89e3b32baca787eea6cbb0a033f6afc9aacca195d622bbe28451221c95b2379b Apr 16 18:09:53.751526 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:53.751487 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sq6jw"] Apr 16 18:09:53.753874 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:09:53.753848 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2773cd95_b80a_4fff_aae0_64c92391563b.slice/crio-a7e390194ad38cfeb5e3b03a9020e585a24ff979e61ee4dbbcbe10331fca0f8e WatchSource:0}: Error finding container a7e390194ad38cfeb5e3b03a9020e585a24ff979e61ee4dbbcbe10331fca0f8e: Status 404 returned error can't find the container with id a7e390194ad38cfeb5e3b03a9020e585a24ff979e61ee4dbbcbe10331fca0f8e Apr 16 18:09:54.114562 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:54.114530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" event={"ID":"e40c940c-a76b-46f0-9dd0-d0d9e342d64a","Type":"ContainerStarted","Data":"89e3b32baca787eea6cbb0a033f6afc9aacca195d622bbe28451221c95b2379b"} Apr 16 18:09:54.115323 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:54.115303 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sq6jw" event={"ID":"2773cd95-b80a-4fff-aae0-64c92391563b","Type":"ContainerStarted","Data":"a7e390194ad38cfeb5e3b03a9020e585a24ff979e61ee4dbbcbe10331fca0f8e"} Apr 16 18:09:56.124077 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:56.124031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" event={"ID":"e40c940c-a76b-46f0-9dd0-d0d9e342d64a","Type":"ContainerStarted","Data":"a192d8fb68b48a8d9f94dbf6c98f84b83569c5578bd2fa3de19173f723982316"} Apr 16 18:09:56.124505 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:56.124267 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:09:56.140891 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:56.140822 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" podStartSLOduration=1.088195858 podStartE2EDuration="3.140805753s" podCreationTimestamp="2026-04-16 18:09:53 +0000 UTC" firstStartedPulling="2026-04-16 18:09:53.722289544 +0000 UTC m=+458.581006838" lastFinishedPulling="2026-04-16 18:09:55.774899437 +0000 UTC 
m=+460.633616733" observedRunningTime="2026-04-16 18:09:56.140045547 +0000 UTC m=+460.998762869" watchObservedRunningTime="2026-04-16 18:09:56.140805753 +0000 UTC m=+460.999523073" Apr 16 18:09:57.128224 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:57.128186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sq6jw" event={"ID":"2773cd95-b80a-4fff-aae0-64c92391563b","Type":"ContainerStarted","Data":"2097ad52144f746f898ac619a93124e2c27fdc01d75dc81c7c1752e97c9e3908"} Apr 16 18:09:57.145430 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:57.145385 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-sq6jw" podStartSLOduration=0.976382977 podStartE2EDuration="4.145369212s" podCreationTimestamp="2026-04-16 18:09:53 +0000 UTC" firstStartedPulling="2026-04-16 18:09:53.754969849 +0000 UTC m=+458.613687144" lastFinishedPulling="2026-04-16 18:09:56.923956084 +0000 UTC m=+461.782673379" observedRunningTime="2026-04-16 18:09:57.144289327 +0000 UTC m=+462.003006643" watchObservedRunningTime="2026-04-16 18:09:57.145369212 +0000 UTC m=+462.004086530" Apr 16 18:09:58.130911 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:58.130880 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:09:59.136038 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:09:59.136010 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-sq6jw" Apr 16 18:10:27.130821 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:10:27.130787 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-mt87x" Apr 16 18:11:01.876194 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:01.876162 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-9qdgh"] Apr 16 18:11:01.879322 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:01.879307 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:01.884088 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:01.884070 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-pfntd\"" Apr 16 18:11:01.884418 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:01.884398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:11:01.889984 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:01.889967 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9qdgh"] Apr 16 18:11:02.058990 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.058960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/593a3f3a-2b1a-4187-879f-c067e73cc4a3-cert\") pod \"odh-model-controller-696fc77849-9qdgh\" (UID: \"593a3f3a-2b1a-4187-879f-c067e73cc4a3\") " pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.059149 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.059003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq6vz\" (UniqueName: \"kubernetes.io/projected/593a3f3a-2b1a-4187-879f-c067e73cc4a3-kube-api-access-kq6vz\") pod \"odh-model-controller-696fc77849-9qdgh\" (UID: \"593a3f3a-2b1a-4187-879f-c067e73cc4a3\") " pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.160201 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.160123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/593a3f3a-2b1a-4187-879f-c067e73cc4a3-cert\") pod \"odh-model-controller-696fc77849-9qdgh\" (UID: \"593a3f3a-2b1a-4187-879f-c067e73cc4a3\") " pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.160201 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.160162 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq6vz\" (UniqueName: \"kubernetes.io/projected/593a3f3a-2b1a-4187-879f-c067e73cc4a3-kube-api-access-kq6vz\") pod \"odh-model-controller-696fc77849-9qdgh\" (UID: \"593a3f3a-2b1a-4187-879f-c067e73cc4a3\") " pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.162401 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.162384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/593a3f3a-2b1a-4187-879f-c067e73cc4a3-cert\") pod \"odh-model-controller-696fc77849-9qdgh\" (UID: \"593a3f3a-2b1a-4187-879f-c067e73cc4a3\") " pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.171223 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.171203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq6vz\" (UniqueName: \"kubernetes.io/projected/593a3f3a-2b1a-4187-879f-c067e73cc4a3-kube-api-access-kq6vz\") pod \"odh-model-controller-696fc77849-9qdgh\" (UID: \"593a3f3a-2b1a-4187-879f-c067e73cc4a3\") " pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.189565 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.189544 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:02.313123 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:02.313091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-9qdgh"] Apr 16 18:11:02.315844 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:11:02.315816 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593a3f3a_2b1a_4187_879f_c067e73cc4a3.slice/crio-7c097ae788c585b4c8565f23b46a3e8de33890007bb3ed3eca9ced41d8e02d3a WatchSource:0}: Error finding container 7c097ae788c585b4c8565f23b46a3e8de33890007bb3ed3eca9ced41d8e02d3a: Status 404 returned error can't find the container with id 7c097ae788c585b4c8565f23b46a3e8de33890007bb3ed3eca9ced41d8e02d3a Apr 16 18:11:03.300926 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:03.300886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9qdgh" event={"ID":"593a3f3a-2b1a-4187-879f-c067e73cc4a3","Type":"ContainerStarted","Data":"7c097ae788c585b4c8565f23b46a3e8de33890007bb3ed3eca9ced41d8e02d3a"} Apr 16 18:11:06.310773 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:06.310739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-9qdgh" event={"ID":"593a3f3a-2b1a-4187-879f-c067e73cc4a3","Type":"ContainerStarted","Data":"b536a2b483d8f00f9a8425174175430be52f943a8efa63c7ed939d1fd77c88a2"} Apr 16 18:11:06.311150 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:06.310868 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:06.329809 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:06.329758 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-9qdgh" podStartSLOduration=2.297107223 podStartE2EDuration="5.329743425s" podCreationTimestamp="2026-04-16 18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:02.317038692 +0000 UTC m=+527.175755987" lastFinishedPulling="2026-04-16 18:11:05.349674891 +0000 UTC m=+530.208392189" observedRunningTime="2026-04-16 18:11:06.328159376 +0000 UTC m=+531.186876694" watchObservedRunningTime="2026-04-16 18:11:06.329743425 +0000 UTC m=+531.188460784" Apr 16 18:11:17.315935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:17.315903 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-9qdgh" Apr 16 18:11:38.954384 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:38.954350 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv"] Apr 16 18:11:38.957621 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:38.957603 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:11:38.960416 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:38.960397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hlvv7\"" Apr 16 18:11:38.965430 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:38.965410 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv"] Apr 16 18:11:39.013663 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:39.013636 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1536598c-4a1b-408f-bcb9-be8974692c07-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv\" (UID: \"1536598c-4a1b-408f-bcb9-be8974692c07\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:11:39.114116 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:39.114089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1536598c-4a1b-408f-bcb9-be8974692c07-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv\" (UID: \"1536598c-4a1b-408f-bcb9-be8974692c07\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:11:39.114430 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:39.114411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1536598c-4a1b-408f-bcb9-be8974692c07-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv\" (UID: \"1536598c-4a1b-408f-bcb9-be8974692c07\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:11:39.267287 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:39.267226 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:11:39.388483 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:39.388454 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv"] Apr 16 18:11:39.390894 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:11:39.390868 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1536598c_4a1b_408f_bcb9_be8974692c07.slice/crio-6ff9a264a3c211bae906a882684b1e2834b2957bf426f65f46a41235b0520b69 WatchSource:0}: Error finding container 6ff9a264a3c211bae906a882684b1e2834b2957bf426f65f46a41235b0520b69: Status 404 returned error can't find the container with id 6ff9a264a3c211bae906a882684b1e2834b2957bf426f65f46a41235b0520b69 Apr 16 18:11:39.401437 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:39.401415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerStarted","Data":"6ff9a264a3c211bae906a882684b1e2834b2957bf426f65f46a41235b0520b69"} Apr 16 18:11:42.024440 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:42.024408 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c58ff8f8c-n7nzh"] Apr 16 18:11:44.418625 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:44.418585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerStarted","Data":"c6effdde91ef9594b107023d300c9029a674fbea4e4fb7aa9017596aa42488fb"} Apr 16 18:11:48.433887 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:48.433857 2572 generic.go:358] "Generic (PLEG): container finished" podID="1536598c-4a1b-408f-bcb9-be8974692c07" containerID="c6effdde91ef9594b107023d300c9029a674fbea4e4fb7aa9017596aa42488fb" exitCode=0 Apr 16 18:11:48.434321 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:11:48.433931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerDied","Data":"c6effdde91ef9594b107023d300c9029a674fbea4e4fb7aa9017596aa42488fb"} Apr 16 18:12:01.476855 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:01.476818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerStarted","Data":"e3deb5cc35395ed57d3284b65efceac20ec6f52947b53801bb0447dd508feee1"} Apr 16 18:12:04.487430 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:04.487397 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerStarted","Data":"182230f04aea3d2e4ec63b9736884b607345bf6a74e6175f1b22785ea9c7a3be"} Apr 16 18:12:04.487843 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:04.487644 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:12:04.488961 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:04.488934 2572 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:04.525150 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:04.525106 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podStartSLOduration=1.909622171 podStartE2EDuration="26.52509563s" podCreationTimestamp="2026-04-16 18:11:38 +0000 UTC" firstStartedPulling="2026-04-16 18:11:39.393151668 +0000 UTC m=+564.251868976" lastFinishedPulling="2026-04-16 18:12:04.008625123 +0000 UTC m=+588.867342435" observedRunningTime="2026-04-16 18:12:04.524309976 +0000 UTC m=+589.383027293" watchObservedRunningTime="2026-04-16 18:12:04.52509563 +0000 UTC m=+589.383812947" Apr 16 18:12:05.490061 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:05.490028 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:12:05.490446 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:05.490136 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:05.491188 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:05.491162 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:06.492764 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:06.492722 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:06.493223 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:06.493076 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:07.048830 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.048793 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c58ff8f8c-n7nzh" podUID="4f2d32e9-f079-4a8a-b5e7-fb7921756978" containerName="console" containerID="cri-o://e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302" gracePeriod=15 Apr 16 18:12:07.279741 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.279720 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c58ff8f8c-n7nzh_4f2d32e9-f079-4a8a-b5e7-fb7921756978/console/0.log" Apr 16 18:12:07.279852 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.279777 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:12:07.445614 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445584 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-oauth-serving-cert\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.445614 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445619 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-trusted-ca-bundle\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.445860 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445660 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-oauth-config\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.445860 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445687 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-service-ca\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.445860 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445844 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpgd4\" (UniqueName: \"kubernetes.io/projected/4f2d32e9-f079-4a8a-b5e7-fb7921756978-kube-api-access-xpgd4\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.445994 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445938 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-config\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.445994 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.445965 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-serving-cert\") pod \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\" (UID: \"4f2d32e9-f079-4a8a-b5e7-fb7921756978\") " Apr 16 18:12:07.446091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446059 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-service-ca" (OuterVolumeSpecName: "service-ca") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:07.446091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446067 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:07.446091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446078 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:07.446357 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446323 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-config" (OuterVolumeSpecName: "console-config") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:07.446357 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446338 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-service-ca\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.446477 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446375 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-oauth-serving-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.446477 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.446390 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-trusted-ca-bundle\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.448067 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.448048 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:07.448285 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.448258 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:07.448369 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.448300 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2d32e9-f079-4a8a-b5e7-fb7921756978-kube-api-access-xpgd4" (OuterVolumeSpecName: "kube-api-access-xpgd4") pod "4f2d32e9-f079-4a8a-b5e7-fb7921756978" (UID: "4f2d32e9-f079-4a8a-b5e7-fb7921756978"). InnerVolumeSpecName "kube-api-access-xpgd4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:07.496274 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.496250 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c58ff8f8c-n7nzh_4f2d32e9-f079-4a8a-b5e7-fb7921756978/console/0.log" Apr 16 18:12:07.496634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.496288 2572 generic.go:358] "Generic (PLEG): container finished" podID="4f2d32e9-f079-4a8a-b5e7-fb7921756978" containerID="e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302" exitCode=2 Apr 16 18:12:07.496634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.496318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c58ff8f8c-n7nzh" event={"ID":"4f2d32e9-f079-4a8a-b5e7-fb7921756978","Type":"ContainerDied","Data":"e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302"} Apr 16 18:12:07.496634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.496340 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c58ff8f8c-n7nzh" event={"ID":"4f2d32e9-f079-4a8a-b5e7-fb7921756978","Type":"ContainerDied","Data":"914ef06ac9b27ba09bdd53d52db40b04fbed983703c0d8ddeef9946d3e1fd5cd"} Apr 16 18:12:07.496634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.496347 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c58ff8f8c-n7nzh" Apr 16 18:12:07.496634 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.496354 2572 scope.go:117] "RemoveContainer" containerID="e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302" Apr 16 18:12:07.503920 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.503896 2572 scope.go:117] "RemoveContainer" containerID="e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302" Apr 16 18:12:07.504186 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:12:07.504166 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302\": container with ID starting with e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302 not found: ID does not exist" containerID="e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302" Apr 16 18:12:07.504261 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.504192 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302"} err="failed to get container status \"e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302\": rpc error: code = NotFound desc = could not find container \"e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302\": container with ID starting with e32c5ecbe582b22ac9bb168bbfc9d1ba08e2f70c9b7c593d81337612fda95302 not found: ID does not exist" Apr 16 18:12:07.517172 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.517152 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c58ff8f8c-n7nzh"] Apr 16 18:12:07.521848 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.521827 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c58ff8f8c-n7nzh"] Apr 16 18:12:07.546792 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.546761 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-config\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.546792 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.546791 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-serving-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.546892 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.546801 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4f2d32e9-f079-4a8a-b5e7-fb7921756978-console-oauth-config\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.546892 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.546812 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpgd4\" (UniqueName: \"kubernetes.io/projected/4f2d32e9-f079-4a8a-b5e7-fb7921756978-kube-api-access-xpgd4\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.743741 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:07.743673 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2d32e9-f079-4a8a-b5e7-fb7921756978" path="/var/lib/kubelet/pods/4f2d32e9-f079-4a8a-b5e7-fb7921756978/volumes" Apr 16 18:12:15.648775 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:15.648751 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:12:15.649345 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:15.649322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:12:16.493189 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:16.493131 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:16.493624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:16.493602 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:26.493676 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:26.493628 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:26.494145 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:26.494046 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:36.493400 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:36.493342 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:36.636287 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:36.493795 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:46.493020 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:46.492968 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:46.493421 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:46.493359 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:12:56.493001 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:56.492956 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:12:56.493436 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:12:56.493409 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:13:06.493636 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:06.493605 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:13:06.494049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:06.493658 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:13:14.101122 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.101091 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv"] Apr 16 18:13:14.101468 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.101382 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" containerID="cri-o://e3deb5cc35395ed57d3284b65efceac20ec6f52947b53801bb0447dd508feee1" gracePeriod=30 Apr 16 18:13:14.101563 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.101454 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" 
containerID="cri-o://182230f04aea3d2e4ec63b9736884b607345bf6a74e6175f1b22785ea9c7a3be" gracePeriod=30 Apr 16 18:13:14.204706 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.204676 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn"] Apr 16 18:13:14.204968 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.204949 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f2d32e9-f079-4a8a-b5e7-fb7921756978" containerName="console" Apr 16 18:13:14.204968 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.204961 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2d32e9-f079-4a8a-b5e7-fb7921756978" containerName="console" Apr 16 18:13:14.205058 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.205006 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f2d32e9-f079-4a8a-b5e7-fb7921756978" containerName="console" Apr 16 18:13:14.207722 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.207706 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:13:14.219561 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.219537 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn"] Apr 16 18:13:14.281284 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.281254 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd"] Apr 16 18:13:14.285967 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.285945 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:13:14.293265 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.293245 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd"] Apr 16 18:13:14.316650 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.316621 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a00fc0aa-212e-475f-b51f-7e194d3adfed-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn\" (UID: \"a00fc0aa-212e-475f-b51f-7e194d3adfed\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:13:14.417886 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.417864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a00fc0aa-212e-475f-b51f-7e194d3adfed-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn\" (UID: \"a00fc0aa-212e-475f-b51f-7e194d3adfed\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:13:14.418012 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.417947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d200a49-2f38-410b-a59d-ad1612eb0a24-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd\" (UID: \"0d200a49-2f38-410b-a59d-ad1612eb0a24\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:13:14.418213 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.418196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a00fc0aa-212e-475f-b51f-7e194d3adfed-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn\" (UID: \"a00fc0aa-212e-475f-b51f-7e194d3adfed\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:13:14.517672 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.517652 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:13:14.518337 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.518272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d200a49-2f38-410b-a59d-ad1612eb0a24-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd\" (UID: \"0d200a49-2f38-410b-a59d-ad1612eb0a24\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:13:14.518582 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.518566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d200a49-2f38-410b-a59d-ad1612eb0a24-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd\" (UID: \"0d200a49-2f38-410b-a59d-ad1612eb0a24\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:13:14.597930 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.597900 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:13:14.632719 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.632675 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn"] Apr 16 18:13:14.635072 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:13:14.635045 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00fc0aa_212e_475f_b51f_7e194d3adfed.slice/crio-36658a3c9837391e99958071c6bc1e2debf9e2f71e89d0489c196bffd771754c WatchSource:0}: Error finding container 36658a3c9837391e99958071c6bc1e2debf9e2f71e89d0489c196bffd771754c: Status 404 returned error can't find the container with id 36658a3c9837391e99958071c6bc1e2debf9e2f71e89d0489c196bffd771754c Apr 16 18:13:14.638766 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.638491 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:13:14.670268 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.670238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" event={"ID":"a00fc0aa-212e-475f-b51f-7e194d3adfed","Type":"ContainerStarted","Data":"36658a3c9837391e99958071c6bc1e2debf9e2f71e89d0489c196bffd771754c"} Apr 16 18:13:14.719064 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:14.719044 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd"] Apr 16 18:13:14.721924 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:13:14.721890 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d200a49_2f38_410b_a59d_ad1612eb0a24.slice/crio-b49c63adb6c7882afc1551aae4601d582c3a57091b38d8b726a796830d522cc9 WatchSource:0}: Error finding container b49c63adb6c7882afc1551aae4601d582c3a57091b38d8b726a796830d522cc9: Status 404 returned error can't find the container with id b49c63adb6c7882afc1551aae4601d582c3a57091b38d8b726a796830d522cc9 Apr 16 18:13:15.673222 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:15.673189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" event={"ID":"0d200a49-2f38-410b-a59d-ad1612eb0a24","Type":"ContainerStarted","Data":"f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940"} Apr 16 18:13:15.673620 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:15.673231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" event={"ID":"0d200a49-2f38-410b-a59d-ad1612eb0a24","Type":"ContainerStarted","Data":"b49c63adb6c7882afc1551aae4601d582c3a57091b38d8b726a796830d522cc9"} Apr 16 18:13:15.674366 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:15.674345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" event={"ID":"a00fc0aa-212e-475f-b51f-7e194d3adfed","Type":"ContainerStarted","Data":"80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400"} Apr 16 18:13:16.493572 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:16.493509 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" 
podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:13:16.493873 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:16.493844 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:13:18.684063 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:18.684035 2572 generic.go:358] "Generic (PLEG): container finished" podID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerID="f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940" exitCode=0 Apr 16 18:13:18.684409 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:18.684111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" event={"ID":"0d200a49-2f38-410b-a59d-ad1612eb0a24","Type":"ContainerDied","Data":"f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940"} Apr 16 18:13:18.685501 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:18.685481 2572 generic.go:358] "Generic (PLEG): container finished" podID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerID="80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400" exitCode=0 Apr 16 18:13:18.685621 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:18.685544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" event={"ID":"a00fc0aa-212e-475f-b51f-7e194d3adfed","Type":"ContainerDied","Data":"80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400"} Apr 16 18:13:18.687404 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:18.687385 2572 generic.go:358] "Generic (PLEG): container finished" podID="1536598c-4a1b-408f-bcb9-be8974692c07" containerID="e3deb5cc35395ed57d3284b65efceac20ec6f52947b53801bb0447dd508feee1" exitCode=0 Apr 16 18:13:18.687482 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:18.687422 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerDied","Data":"e3deb5cc35395ed57d3284b65efceac20ec6f52947b53801bb0447dd508feee1"} Apr 16 18:13:19.692032 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:19.691990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" event={"ID":"a00fc0aa-212e-475f-b51f-7e194d3adfed","Type":"ContainerStarted","Data":"7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea"} Apr 16 18:13:19.692504 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:19.692293 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:13:19.693812 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:19.693784 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:13:19.710745 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:19.710690 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podStartSLOduration=5.710674589 podStartE2EDuration="5.710674589s" podCreationTimestamp="2026-04-16 18:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:13:19.709384881 +0000 UTC m=+664.568102199" watchObservedRunningTime="2026-04-16 18:13:19.710674589 +0000 UTC m=+664.569391907" Apr 16 18:13:20.695594 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:20.695559 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:13:26.492833 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:26.492790 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:13:26.493290 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:26.493125 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:13:30.695743 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:30.695701 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:13:36.493047 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:36.492998 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 16 18:13:36.493456 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:36.493217 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:13:36.493456 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:36.493349 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:13:36.493560 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:36.493526 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:13:37.748465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:37.748433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" 
event={"ID":"0d200a49-2f38-410b-a59d-ad1612eb0a24","Type":"ContainerStarted","Data":"15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd"} Apr 16 18:13:37.748807 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:37.748721 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:13:37.749700 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:37.749676 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:13:37.792903 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:37.792857 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podStartSLOduration=4.824673485 podStartE2EDuration="23.792845264s" podCreationTimestamp="2026-04-16 18:13:14 +0000 UTC" firstStartedPulling="2026-04-16 18:13:18.685444482 +0000 UTC m=+663.544161777" lastFinishedPulling="2026-04-16 18:13:37.653616258 +0000 UTC m=+682.512333556" observedRunningTime="2026-04-16 18:13:37.792131312 +0000 UTC m=+682.650848653" watchObservedRunningTime="2026-04-16 18:13:37.792845264 +0000 UTC m=+682.651562581" Apr 16 18:13:38.751945 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:38.751904 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:13:40.696151 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:40.696108 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:13:44.768993 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.768964 2572 generic.go:358] "Generic (PLEG): container finished" podID="1536598c-4a1b-408f-bcb9-be8974692c07" containerID="182230f04aea3d2e4ec63b9736884b607345bf6a74e6175f1b22785ea9c7a3be" exitCode=0 Apr 16 18:13:44.769268 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.769032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerDied","Data":"182230f04aea3d2e4ec63b9736884b607345bf6a74e6175f1b22785ea9c7a3be"} Apr 16 18:13:44.769268 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.769064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" event={"ID":"1536598c-4a1b-408f-bcb9-be8974692c07","Type":"ContainerDied","Data":"6ff9a264a3c211bae906a882684b1e2834b2957bf426f65f46a41235b0520b69"} Apr 16 18:13:44.769268 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.769074 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff9a264a3c211bae906a882684b1e2834b2957bf426f65f46a41235b0520b69" Apr 16 18:13:44.776627 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.776611 2572 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:13:44.867648 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.867622 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1536598c-4a1b-408f-bcb9-be8974692c07-kserve-provision-location\") pod \"1536598c-4a1b-408f-bcb9-be8974692c07\" (UID: \"1536598c-4a1b-408f-bcb9-be8974692c07\") " Apr 16 18:13:44.867937 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.867918 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1536598c-4a1b-408f-bcb9-be8974692c07-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1536598c-4a1b-408f-bcb9-be8974692c07" (UID: "1536598c-4a1b-408f-bcb9-be8974692c07"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:13:44.968210 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:44.968134 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1536598c-4a1b-408f-bcb9-be8974692c07-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:13:45.771545 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:45.771505 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv" Apr 16 18:13:45.820556 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:45.820533 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv"] Apr 16 18:13:45.839762 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:45.839741 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-33428-predictor-56558c5448-hcxjv"] Apr 16 18:13:47.743287 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:47.743250 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" path="/var/lib/kubelet/pods/1536598c-4a1b-408f-bcb9-be8974692c07/volumes" Apr 16 18:13:48.752451 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:48.752415 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:13:50.695641 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:50.695598 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:13:58.752715 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:13:58.752676 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:14:00.696489 ip-10-0-128-209 kubenswrapper[2572]: 
I0416 18:14:00.696445 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:14:08.752755 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:08.752668 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:14:10.696570 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:10.696507 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:14:18.752728 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:18.752685 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:14:20.696099 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:20.696062 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:14:28.752098 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:28.752061 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:14:30.696674 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:30.696638 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:14:38.753555 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:38.753506 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:14:54.445443 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.445414 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn"] Apr 16 18:14:54.445857 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.445684 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container" containerID="cri-o://7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea" gracePeriod=30 Apr 16 18:14:54.487254 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487226 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"] Apr 16 18:14:54.487543 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487529 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="storage-initializer" Apr 16 18:14:54.487588 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487546 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="storage-initializer" Apr 16 18:14:54.487588 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487559 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" Apr 16 18:14:54.487588 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487564 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" Apr 16 18:14:54.487588 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487570 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" Apr 16 18:14:54.487588 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487576 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" Apr 16 18:14:54.487730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487625 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="agent" Apr 16 18:14:54.487730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.487633 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1536598c-4a1b-408f-bcb9-be8974692c07" containerName="kserve-container" Apr 16 18:14:54.489860 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.489839 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" Apr 16 18:14:54.502890 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.502867 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"] Apr 16 18:14:54.540900 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.540873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4408af8f-7139-46ac-8491-97418e10936c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w\" (UID: \"4408af8f-7139-46ac-8491-97418e10936c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" Apr 16 18:14:54.559730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.559702 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"] Apr 16 18:14:54.562156 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.562143 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" Apr 16 18:14:54.575977 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.575955 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"] Apr 16 18:14:54.641951 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.641923 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba7edde-5035-45db-8f49-dd69e0cae858-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg\" (UID: \"2ba7edde-5035-45db-8f49-dd69e0cae858\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" Apr 16 18:14:54.642131 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.641963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4408af8f-7139-46ac-8491-97418e10936c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w\" (UID: \"4408af8f-7139-46ac-8491-97418e10936c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" Apr 16 18:14:54.642252 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.642235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4408af8f-7139-46ac-8491-97418e10936c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w\" (UID: \"4408af8f-7139-46ac-8491-97418e10936c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" Apr 16 18:14:54.695488 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.695421 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd"] Apr 16 18:14:54.695720 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.695697 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container" containerID="cri-o://15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd" gracePeriod=30 Apr 16 18:14:54.742578 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.742549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba7edde-5035-45db-8f49-dd69e0cae858-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg\" (UID: \"2ba7edde-5035-45db-8f49-dd69e0cae858\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" Apr 16 18:14:54.742870 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.742854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba7edde-5035-45db-8f49-dd69e0cae858-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg\" (UID: \"2ba7edde-5035-45db-8f49-dd69e0cae858\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" Apr 16 18:14:54.798889 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.798859 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" Apr 16 18:14:54.870889 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.870862 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" Apr 16 18:14:54.919673 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.919639 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"] Apr 16 18:14:54.922344 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:14:54.922277 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4408af8f_7139_46ac_8491_97418e10936c.slice/crio-c789841be1dfedb3b17d10d333da036170f0157ce9f329bbdb97988d89d5ac39 WatchSource:0}: Error finding container c789841be1dfedb3b17d10d333da036170f0157ce9f329bbdb97988d89d5ac39: Status 404 returned error can't find the container with id c789841be1dfedb3b17d10d333da036170f0157ce9f329bbdb97988d89d5ac39 Apr 16 18:14:54.960874 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.960850 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" event={"ID":"4408af8f-7139-46ac-8491-97418e10936c","Type":"ContainerStarted","Data":"c789841be1dfedb3b17d10d333da036170f0157ce9f329bbdb97988d89d5ac39"} Apr 16 18:14:54.995955 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:54.995931 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"] Apr 16 18:14:54.998749 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:14:54.998724 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ba7edde_5035_45db_8f49_dd69e0cae858.slice/crio-eb721218d139b3fc40573312c95ae0ec505b1dc4f414455286865dc30e45460c WatchSource:0}: Error finding container eb721218d139b3fc40573312c95ae0ec505b1dc4f414455286865dc30e45460c: Status 404 returned error can't find the container with id eb721218d139b3fc40573312c95ae0ec505b1dc4f414455286865dc30e45460c Apr 16 18:14:55.964387 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:55.964348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" event={"ID":"4408af8f-7139-46ac-8491-97418e10936c","Type":"ContainerStarted","Data":"b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c"} Apr 16 18:14:55.965595 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:55.965573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" event={"ID":"2ba7edde-5035-45db-8f49-dd69e0cae858","Type":"ContainerStarted","Data":"f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6"} Apr 16 18:14:55.965710 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:55.965599 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" event={"ID":"2ba7edde-5035-45db-8f49-dd69e0cae858","Type":"ContainerStarted","Data":"eb721218d139b3fc40573312c95ae0ec505b1dc4f414455286865dc30e45460c"} Apr 16 18:14:58.230091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.230067 2572 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:14:58.271994 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.271933 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d200a49-2f38-410b-a59d-ad1612eb0a24-kserve-provision-location\") pod \"0d200a49-2f38-410b-a59d-ad1612eb0a24\" (UID: \"0d200a49-2f38-410b-a59d-ad1612eb0a24\") " Apr 16 18:14:58.272227 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.272205 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d200a49-2f38-410b-a59d-ad1612eb0a24-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0d200a49-2f38-410b-a59d-ad1612eb0a24" (UID: "0d200a49-2f38-410b-a59d-ad1612eb0a24"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:58.372548 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.372504 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d200a49-2f38-410b-a59d-ad1612eb0a24-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:14:58.472452 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.472434 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:14:58.573814 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.573736 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a00fc0aa-212e-475f-b51f-7e194d3adfed-kserve-provision-location\") pod \"a00fc0aa-212e-475f-b51f-7e194d3adfed\" (UID: \"a00fc0aa-212e-475f-b51f-7e194d3adfed\") " Apr 16 18:14:58.574060 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.574037 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a00fc0aa-212e-475f-b51f-7e194d3adfed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a00fc0aa-212e-475f-b51f-7e194d3adfed" (UID: "a00fc0aa-212e-475f-b51f-7e194d3adfed"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:58.674253 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.674222 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a00fc0aa-212e-475f-b51f-7e194d3adfed-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:14:58.975557 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.975526 2572 generic.go:358] "Generic (PLEG): container finished" podID="4408af8f-7139-46ac-8491-97418e10936c" containerID="b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c" exitCode=0 Apr 16 18:14:58.975693 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.975591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" event={"ID":"4408af8f-7139-46ac-8491-97418e10936c","Type":"ContainerDied","Data":"b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c"} Apr 16 18:14:58.976910 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.976887 2572 generic.go:358] "Generic (PLEG): container finished" podID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerID="f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6" exitCode=0 Apr 16 18:14:58.976999 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.976936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" event={"ID":"2ba7edde-5035-45db-8f49-dd69e0cae858","Type":"ContainerDied","Data":"f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6"} Apr 16 18:14:58.978390 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.978372 2572 generic.go:358] "Generic (PLEG): container finished" podID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerID="15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd" exitCode=0 Apr 16 18:14:58.978465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.978446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" event={"ID":"0d200a49-2f38-410b-a59d-ad1612eb0a24","Type":"ContainerDied","Data":"15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd"} Apr 16 18:14:58.978465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.978450 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" Apr 16 18:14:58.978465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.978464 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd" event={"ID":"0d200a49-2f38-410b-a59d-ad1612eb0a24","Type":"ContainerDied","Data":"b49c63adb6c7882afc1551aae4601d582c3a57091b38d8b726a796830d522cc9"} Apr 16 18:14:58.978658 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.978478 2572 scope.go:117] "RemoveContainer" containerID="15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd" Apr 16 18:14:58.980321 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.980294 2572 generic.go:358] "Generic (PLEG): container finished" podID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerID="7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea" exitCode=0 Apr 16 18:14:58.980374 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.980336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" event={"ID":"a00fc0aa-212e-475f-b51f-7e194d3adfed","Type":"ContainerDied","Data":"7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea"} Apr 16 18:14:58.980374 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.980360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" event={"ID":"a00fc0aa-212e-475f-b51f-7e194d3adfed","Type":"ContainerDied","Data":"36658a3c9837391e99958071c6bc1e2debf9e2f71e89d0489c196bffd771754c"} Apr 16 18:14:58.980443 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.980429 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn" Apr 16 18:14:58.990016 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.989945 2572 scope.go:117] "RemoveContainer" containerID="f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940" Apr 16 18:14:58.998364 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.998318 2572 scope.go:117] "RemoveContainer" containerID="15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd" Apr 16 18:14:58.998737 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:14:58.998716 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd\": container with ID starting with 15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd not found: ID does not exist" containerID="15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd" Apr 16 18:14:58.998816 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.998743 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd"} err="failed to get container status \"15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd\": rpc error: code = NotFound desc = could not find container \"15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd\": container with ID starting with 15d892b4afe9c44ab5f2561680724e3fb4f1b5d5442da0b22a65f8e3958dbecd not found: ID does not exist" Apr 16 18:14:58.998816 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.998759 2572 scope.go:117] "RemoveContainer" containerID="f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940" Apr 16 18:14:58.999020 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:14:58.999003 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940\": container with ID starting with f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940 not found: ID does not exist" containerID="f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940" Apr 16 18:14:58.999078 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.999023 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940"} err="failed to get container status \"f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940\": rpc error: code = NotFound desc = could not find container \"f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940\": container with ID starting with f5129ca79a3798ba8d4e25808514eea668df344ef79c48b46206ff9dab952940 not found: ID does not exist" Apr 16 18:14:58.999078 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:58.999037 2572 scope.go:117] "RemoveContainer" containerID="7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea" Apr 16 18:14:59.005713 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.005696 2572 scope.go:117] "RemoveContainer" containerID="80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400" Apr 16 18:14:59.013949 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.013925 2572 scope.go:117] "RemoveContainer" containerID="7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea" Apr 16 18:14:59.014218 ip-10-0-128-209 kubenswrapper[2572]: E0416 
18:14:59.014165 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea\": container with ID starting with 7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea not found: ID does not exist" containerID="7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea"
Apr 16 18:14:59.014310 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.014223 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea"} err="failed to get container status \"7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea\": rpc error: code = NotFound desc = could not find container \"7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea\": container with ID starting with 7123ad8a169e93357012e1656aaeaa0e9794c391da98f94e98a0a9efc93f0aea not found: ID does not exist"
Apr 16 18:14:59.014310 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.014242 2572 scope.go:117] "RemoveContainer" containerID="80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400"
Apr 16 18:14:59.014488 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:14:59.014470 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400\": container with ID starting with 80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400 not found: ID does not exist" containerID="80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400"
Apr 16 18:14:59.014588 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.014488 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400"} err="failed to get container status \"80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400\": rpc error: code = NotFound desc = could not find container \"80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400\": container with ID starting with 80e238c6462b6a45961c4d9f940c0ba553beab6e70ee736c529214f5ae7b2400 not found: ID does not exist"
Apr 16 18:14:59.031280 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.031261 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd"]
Apr 16 18:14:59.035239 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.035219 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-bfe16-predictor-c9df9bc-s4fdd"]
Apr 16 18:14:59.050386 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.050366 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn"]
Apr 16 18:14:59.055057 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.055039 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-bfe16-predictor-595d5dd96b-b74dn"]
Apr 16 18:14:59.743635 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.743604 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" path="/var/lib/kubelet/pods/0d200a49-2f38-410b-a59d-ad1612eb0a24/volumes"
Apr 16 18:14:59.743971 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.743949 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" path="/var/lib/kubelet/pods/a00fc0aa-212e-475f-b51f-7e194d3adfed/volumes"
Apr 16 18:14:59.986403 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.986369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" event={"ID":"4408af8f-7139-46ac-8491-97418e10936c","Type":"ContainerStarted","Data":"f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51"}
Apr 16 18:14:59.986698 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.986669 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"
Apr 16 18:14:59.988027 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.987989 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:14:59.988123 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.988049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" event={"ID":"2ba7edde-5035-45db-8f49-dd69e0cae858","Type":"ContainerStarted","Data":"9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f"}
Apr 16 18:14:59.988322 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.988304 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"
Apr 16 18:14:59.989152 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:14:59.989131 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:15:00.004836 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:00.004764 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podStartSLOduration=6.004753869 podStartE2EDuration="6.004753869s" podCreationTimestamp="2026-04-16 18:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:15:00.003632543 +0000 UTC m=+764.862349872" watchObservedRunningTime="2026-04-16 18:15:00.004753869 +0000 UTC m=+764.863471185"
Apr 16 18:15:00.020310 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:00.020273 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podStartSLOduration=6.020260942 podStartE2EDuration="6.020260942s" podCreationTimestamp="2026-04-16 18:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:15:00.018791325 +0000 UTC m=+764.877508655" watchObservedRunningTime="2026-04-16 18:15:00.020260942 +0000 UTC m=+764.878978259"
Apr 16 18:15:00.991683 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:00.991645 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:15:00.992027 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:00.991644 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:15:10.992051 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:10.992008 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:15:10.992377 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:10.992010 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:15:20.992507 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:20.992459 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:15:20.992883 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:20.992459 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:15:30.991781 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:30.991736 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:15:30.992221 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:30.991736 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:15:40.991624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:40.991586 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:15:40.992008 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:40.991751 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:15:50.992017 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:50.991974 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 16 18:15:50.992365 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:15:50.991974 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 16 18:16:00.992692 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:00.992657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"
Apr 16 18:16:00.993160 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:00.992847 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"
Apr 16 18:16:24.712178 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:24.712145 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"]
Apr 16 18:16:24.712636 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:24.712396 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container" containerID="cri-o://f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51" gracePeriod=30
Apr 16 18:16:24.819974 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:24.819945 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"]
Apr 16 18:16:24.820208 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:24.820189 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container" containerID="cri-o://9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f" gracePeriod=30
Apr 16 18:16:27.965734 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:27.965706 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"
Apr 16 18:16:27.999152 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:27.999128 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba7edde-5035-45db-8f49-dd69e0cae858-kserve-provision-location\") pod \"2ba7edde-5035-45db-8f49-dd69e0cae858\" (UID: \"2ba7edde-5035-45db-8f49-dd69e0cae858\") "
Apr 16 18:16:27.999433 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:27.999414 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ba7edde-5035-45db-8f49-dd69e0cae858-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ba7edde-5035-45db-8f49-dd69e0cae858" (UID: "2ba7edde-5035-45db-8f49-dd69e0cae858"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:16:28.099731 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.099666 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ba7edde-5035-45db-8f49-dd69e0cae858-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\""
Apr 16 18:16:28.231188 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.231157 2572 generic.go:358] "Generic (PLEG): container finished" podID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerID="9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f" exitCode=0
Apr 16 18:16:28.231327 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.231221 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"
Apr 16 18:16:28.231327 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.231238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" event={"ID":"2ba7edde-5035-45db-8f49-dd69e0cae858","Type":"ContainerDied","Data":"9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f"}
Apr 16 18:16:28.231327 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.231276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg" event={"ID":"2ba7edde-5035-45db-8f49-dd69e0cae858","Type":"ContainerDied","Data":"eb721218d139b3fc40573312c95ae0ec505b1dc4f414455286865dc30e45460c"}
Apr 16 18:16:28.231327 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.231291 2572 scope.go:117] "RemoveContainer" containerID="9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f"
Apr 16 18:16:28.239228 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.239198 2572 scope.go:117] "RemoveContainer" containerID="f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6"
Apr 16 18:16:28.245868 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.245853 2572 scope.go:117] "RemoveContainer" containerID="9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f"
Apr 16 18:16:28.246097 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:16:28.246079 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f\": container with ID starting with 9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f not found: ID does not exist" containerID="9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f"
Apr 16 18:16:28.246165 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.246108 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f"} err="failed to get container status \"9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f\": rpc error: code = NotFound desc = could not find container \"9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f\": container with ID starting with 9ecf951dd581142585432777d856052cb21dfa767193849c8b06d06d9ff6c13f not found: ID does not exist"
Apr 16 18:16:28.246165 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.246128 2572 scope.go:117] "RemoveContainer" containerID="f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6"
Apr 16 18:16:28.246346 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:16:28.246330 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6\": container with ID starting with f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6 not found: ID does not exist" containerID="f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6"
Apr 16 18:16:28.246382 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.246354 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6"} err="failed to get container status \"f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6\": rpc error: code = NotFound desc = could not find container \"f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6\": container with ID starting with f1f84c466a258124036253f62d91437bfa257f2bbabd9801619749d6f857e7a6 not found: ID does not exist"
Apr 16 18:16:28.253572 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.253554 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"]
Apr 16 18:16:28.256624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.256604 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-7f673-predictor-96dccc47f-4jqxg"]
Apr 16 18:16:28.644109 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.644087 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"
Apr 16 18:16:28.703167 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.703141 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4408af8f-7139-46ac-8491-97418e10936c-kserve-provision-location\") pod \"4408af8f-7139-46ac-8491-97418e10936c\" (UID: \"4408af8f-7139-46ac-8491-97418e10936c\") "
Apr 16 18:16:28.703385 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.703366 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4408af8f-7139-46ac-8491-97418e10936c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4408af8f-7139-46ac-8491-97418e10936c" (UID: "4408af8f-7139-46ac-8491-97418e10936c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:16:28.804065 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:28.804019 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4408af8f-7139-46ac-8491-97418e10936c-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\""
Apr 16 18:16:29.235697 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.235660 2572 generic.go:358] "Generic (PLEG): container finished" podID="4408af8f-7139-46ac-8491-97418e10936c" containerID="f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51" exitCode=0
Apr 16 18:16:29.236122 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.235721 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"
Apr 16 18:16:29.236122 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.235728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" event={"ID":"4408af8f-7139-46ac-8491-97418e10936c","Type":"ContainerDied","Data":"f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51"}
Apr 16 18:16:29.236122 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.235754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w" event={"ID":"4408af8f-7139-46ac-8491-97418e10936c","Type":"ContainerDied","Data":"c789841be1dfedb3b17d10d333da036170f0157ce9f329bbdb97988d89d5ac39"}
Apr 16 18:16:29.236122 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.235777 2572 scope.go:117] "RemoveContainer" containerID="f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51"
Apr 16 18:16:29.244302 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.244280 2572 scope.go:117] "RemoveContainer" containerID="b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c"
Apr 16 18:16:29.250456 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.250441 2572 scope.go:117] "RemoveContainer" containerID="f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51"
Apr 16 18:16:29.250681 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:16:29.250663 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51\": container with ID starting with f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51 not found: ID does not exist" containerID="f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51"
Apr 16 18:16:29.250743 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.250686 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51"} err="failed to get container status \"f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51\": rpc error: code = NotFound desc = could not find container \"f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51\": container with ID starting with f966ba2b1da8b8b822f1e70dcde3d10bc6e6949f9fddfa62d766964159213c51 not found: ID does not exist"
Apr 16 18:16:29.250743 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.250701 2572 scope.go:117] "RemoveContainer" containerID="b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c"
Apr 16 18:16:29.250921 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:16:29.250906 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c\": container with ID starting with b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c not found: ID does not exist" containerID="b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c"
Apr 16 18:16:29.250957 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.250924 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c"} err="failed to get container status \"b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c\": rpc error: code = NotFound desc = could not find container \"b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c\": container with ID starting with b15e2e4966b2b771ee0353f2a6b0b9a7a561bd24c3d1aad8d3fdc90ee6c4926c not found: ID does not exist"
Apr 16 18:16:29.256419 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.256396 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"]
Apr 16 18:16:29.262187 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.262165 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-7f673-predictor-778ff59858-jwm5w"]
Apr 16 18:16:29.743046 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.743016 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" path="/var/lib/kubelet/pods/2ba7edde-5035-45db-8f49-dd69e0cae858/volumes"
Apr 16 18:16:29.743358 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:29.743345 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4408af8f-7139-46ac-8491-97418e10936c" path="/var/lib/kubelet/pods/4408af8f-7139-46ac-8491-97418e10936c/volumes"
Apr 16 18:16:34.787824 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.787794 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"]
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788089 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788102 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788112 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788118 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788131 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788136 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788143 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788148 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788154 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788159 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788169 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788174 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788181 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="storage-initializer"
Apr 16 18:16:34.788182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788186 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="storage-initializer"
Apr 16 18:16:34.788623 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788191 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container"
Apr 16 18:16:34.788623 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788196 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container"
Apr 16 18:16:34.788623 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788236 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ba7edde-5035-45db-8f49-dd69e0cae858" containerName="kserve-container"
Apr 16 18:16:34.788623 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788244 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d200a49-2f38-410b-a59d-ad1612eb0a24" containerName="kserve-container"
Apr 16 18:16:34.788623 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788251 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4408af8f-7139-46ac-8491-97418e10936c" containerName="kserve-container"
Apr 16 18:16:34.788623 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.788257 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a00fc0aa-212e-475f-b51f-7e194d3adfed" containerName="kserve-container"
Apr 16 18:16:34.792768 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.792750 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:34.796224 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.796199 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hlvv7\""
Apr 16 18:16:34.800411 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.800389 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"]
Apr 16 18:16:34.847931 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.847903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d-kserve-provision-location\") pod \"isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw\" (UID: \"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d\") " pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:34.948245 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.948218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d-kserve-provision-location\") pod \"isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw\" (UID: \"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d\") " pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:34.948582 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:34.948563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d-kserve-provision-location\") pod \"isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw\" (UID: \"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d\") " pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:35.103479 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:35.103447 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:35.221534 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:35.221495 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"]
Apr 16 18:16:35.223656 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:16:35.223621 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac6194d_a1ff_4c4e_bd3c_7c144f105a8d.slice/crio-cd6a4f5dd52c6e9a62a97e544872bc5cb9f1ae50a3f146ea99dbb05747555b43 WatchSource:0}: Error finding container cd6a4f5dd52c6e9a62a97e544872bc5cb9f1ae50a3f146ea99dbb05747555b43: Status 404 returned error can't find the container with id cd6a4f5dd52c6e9a62a97e544872bc5cb9f1ae50a3f146ea99dbb05747555b43
Apr 16 18:16:35.254569 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:35.254538 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerStarted","Data":"cd6a4f5dd52c6e9a62a97e544872bc5cb9f1ae50a3f146ea99dbb05747555b43"}
Apr 16 18:16:36.258084 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:36.258053 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerStarted","Data":"08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e"}
Apr 16 18:16:39.266625 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:39.266537 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerID="08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e" exitCode=0
Apr 16 18:16:39.266625 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:39.266598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerDied","Data":"08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e"}
Apr 16 18:16:40.270963 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:40.270927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerStarted","Data":"ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563"}
Apr 16 18:16:40.271285 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:40.270973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerStarted","Data":"11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915"}
Apr 16 18:16:40.271285 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:40.271257 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:40.272385 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:40.272359 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:16:40.292935 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:40.292885 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podStartSLOduration=6.29286859 podStartE2EDuration="6.29286859s" podCreationTimestamp="2026-04-16 18:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:16:40.290863823 +0000 UTC m=+865.149581141" watchObservedRunningTime="2026-04-16 18:16:40.29286859 +0000 UTC m=+865.151585908"
Apr 16 18:16:41.274004 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:41.273973 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:16:41.274428 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:41.274100 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:16:41.275016 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:41.274992 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:42.276858 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:42.276815 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:16:42.277299 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:42.277092 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:52.277228 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:52.277175 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:16:52.277677 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:16:52.277653 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:02.277526 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:02.277473 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:17:02.278047 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:02.277896 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:12.277724 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:12.277625 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:17:12.278187 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:12.278029 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:15.667227 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:15.667199 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log"
Apr 16 18:17:15.668222 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:15.668202 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log"
Apr 16 18:17:22.276872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:22.276828 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:17:22.277335 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:22.277239 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:32.277470 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:32.277420 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:17:32.277873 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:32.277828 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:42.277435 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:42.277407 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:17:42.279833 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:42.277596 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:17:49.977711 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:49.977678 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"]
Apr 16 18:17:49.978143 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:49.978086 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" containerID="cri-o://11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915" gracePeriod=30
Apr 16 18:17:49.978233 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:49.978182 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" containerID="cri-o://ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563" gracePeriod=30
Apr 16 18:17:50.039931 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.039901 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"]
Apr 16 18:17:50.045954 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.045931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:17:50.051214 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.051192 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"]
Apr 16 18:17:50.188375 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.188345 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84cd0f10-5995-4e4a-a117-686fcb6e6cc7-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6\" (UID: \"84cd0f10-5995-4e4a-a117-686fcb6e6cc7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:17:50.289161 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.289084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84cd0f10-5995-4e4a-a117-686fcb6e6cc7-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6\" (UID: \"84cd0f10-5995-4e4a-a117-686fcb6e6cc7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:17:50.289409 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.289393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84cd0f10-5995-4e4a-a117-686fcb6e6cc7-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6\" (UID: \"84cd0f10-5995-4e4a-a117-686fcb6e6cc7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:17:50.356184 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.356159 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:17:50.468283 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:50.468261 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"]
Apr 16 18:17:51.468354 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:51.468318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" event={"ID":"84cd0f10-5995-4e4a-a117-686fcb6e6cc7","Type":"ContainerStarted","Data":"2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e"}
Apr 16 18:17:51.468354 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:51.468351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" event={"ID":"84cd0f10-5995-4e4a-a117-686fcb6e6cc7","Type":"ContainerStarted","Data":"ac17e0c47e34e00bff04fe6e3c264315062a78cbbb3bed902bf8381bc965f16d"}
Apr 16 18:17:52.277579 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:52.277537 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:17:52.277917 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:52.277894 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:54.477798 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:54.477724 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerID="11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915" exitCode=0
Apr 16 18:17:54.478186 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:54.477798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerDied","Data":"11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915"}
Apr 16 18:17:54.479049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:54.479023 2572 generic.go:358] "Generic (PLEG): container finished" podID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerID="2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e" exitCode=0
Apr 16 18:17:54.479156 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:54.479062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" event={"ID":"84cd0f10-5995-4e4a-a117-686fcb6e6cc7","Type":"ContainerDied","Data":"2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e"}
Apr 16 18:17:55.483289 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:55.483256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" event={"ID":"84cd0f10-5995-4e4a-a117-686fcb6e6cc7","Type":"ContainerStarted","Data":"ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059"}
Apr 16 18:17:55.483733 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:55.483632 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:17:55.484637 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:55.484611 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:17:55.501936 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:55.501877 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podStartSLOduration=5.501853005 podStartE2EDuration="5.501853005s" podCreationTimestamp="2026-04-16 18:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:55.501114571 +0000 UTC m=+940.359831888" watchObservedRunningTime="2026-04-16 18:17:55.501853005 +0000 UTC m=+940.360570323"
Apr 16 18:17:56.486117 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:17:56.486083 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:18:02.277049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:02.277002 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:18:02.277450 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:02.277279 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:18:06.486710 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:06.486670 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:18:12.277398 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:12.277352 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:18:12.277936 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:12.277477 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:18:12.277936 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:12.277764 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:18:12.277936 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:12.277872 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:18:15.696777 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:15.696746 2572 scope.go:117] "RemoveContainer" containerID="c6effdde91ef9594b107023d300c9029a674fbea4e4fb7aa9017596aa42488fb"
Apr 16 18:18:15.703985 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:15.703968 2572 scope.go:117] "RemoveContainer" containerID="182230f04aea3d2e4ec63b9736884b607345bf6a74e6175f1b22785ea9c7a3be"
Apr 16 18:18:15.710503 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:15.710486 2572 scope.go:117] "RemoveContainer" containerID="e3deb5cc35395ed57d3284b65efceac20ec6f52947b53801bb0447dd508feee1"
Apr 16 18:18:16.486031 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:16.485994 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:18:20.151017 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.150996 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:18:20.202385 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.202356 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d-kserve-provision-location\") pod \"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d\" (UID: \"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d\") "
Apr 16 18:18:20.202665 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.202643 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" (UID: "4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:18:20.302876 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.302800 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\""
Apr 16 18:18:20.550842 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.550804 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerID="ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563" exitCode=137
Apr 16 18:18:20.550978 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.550848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerDied","Data":"ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563"}
Apr 16 18:18:20.550978 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.550881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw" event={"ID":"4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d","Type":"ContainerDied","Data":"cd6a4f5dd52c6e9a62a97e544872bc5cb9f1ae50a3f146ea99dbb05747555b43"}
Apr 16 18:18:20.550978 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.550885 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"
Apr 16 18:18:20.550978 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.550901 2572 scope.go:117] "RemoveContainer" containerID="ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563"
Apr 16 18:18:20.560369 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.560352 2572 scope.go:117] "RemoveContainer" containerID="11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915"
Apr 16 18:18:20.566838 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.566821 2572 scope.go:117] "RemoveContainer" containerID="08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e"
Apr 16 18:18:20.573169 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573153 2572 scope.go:117] "RemoveContainer" containerID="ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563"
Apr 16 18:18:20.573356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573334 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"]
Apr 16 18:18:20.573462 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:18:20.573382 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563\": container with ID starting with ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563 not found: ID does not exist" containerID="ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563"
Apr 16 18:18:20.573462 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573412 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563"} err="failed to get container status \"ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563\": rpc error: code = NotFound desc = could not find container \"ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563\": container with ID starting with ef541512d2c9f21ab6120d9ec13342855988efa76f5ab72ca5fad28d75bfe563 not found: ID does not exist"
Apr 16 18:18:20.573462 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573433 2572 scope.go:117] "RemoveContainer" containerID="11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915"
Apr 16 18:18:20.573688 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:18:20.573671 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915\": container with ID starting with 11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915 not found: ID does not exist" containerID="11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915"
Apr 16 18:18:20.573759 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573697 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915"} err="failed to get container status \"11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915\": rpc error: code = NotFound desc = could not find container \"11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915\": container with ID starting with 11063969687be7bf558271efd6b1a85ae59684d5b987361dce02988e434ba915 not found: ID does not exist"
Apr 16 18:18:20.573759 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573714 2572 scope.go:117] "RemoveContainer" containerID="08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e"
Apr 16 18:18:20.573948 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:18:20.573932 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e\": container with ID starting with 08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e not found: ID does not exist" containerID="08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e"
Apr 16 18:18:20.573983 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.573954 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e"} err="failed to get container status \"08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e\": rpc error: code = NotFound desc = could not find container \"08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e\": container with ID starting with 08026fcf42a1402be7a918617dbd6cdb74d1879918b14b275cc32c51bfab2d0e not found: ID does not exist"
Apr 16 18:18:20.577187 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:20.577167 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-28c03-predictor-6fb4bfb79f-dqltw"]
Apr 16 18:18:21.742786 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:21.742755 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" path="/var/lib/kubelet/pods/4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d/volumes"
Apr 16 18:18:26.486621 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:26.486580 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:18:36.486377 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:36.486289 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:18:46.486910 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:46.486869 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:18:56.486553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:18:56.486496 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:06.486393 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:06.486356 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:16.486573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:16.486533 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:26.486338 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:26.486298 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:26.740582 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:26.740474 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:36.741172 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:36.741136 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:46.740971 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:46.740931 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:19:56.741309 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:19:56.741271 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 16 18:20:06.742221 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:06.742148 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"
Apr 16 18:20:10.220813 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.220784 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"]
Apr 16 18:20:10.221188 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.220995 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" containerID="cri-o://ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059" gracePeriod=30
Apr 16 18:20:10.298168 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298136 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"]
Apr 16 18:20:10.298453 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298437 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container"
Apr 16 18:20:10.298554 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298455 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container"
Apr 16 18:20:10.298554 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298487 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent"
Apr 16 18:20:10.298554 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298496 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent"
Apr 16 18:20:10.298554 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298507 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="storage-initializer"
Apr 16 18:20:10.298554 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298531 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="storage-initializer"
Apr 16 18:20:10.298787 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298600 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="kserve-container"
Apr 16 18:20:10.298787 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.298615 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ac6194d-a1ff-4c4e-bd3c-7c144f105a8d" containerName="agent"
Apr 16 18:20:10.301380 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.301361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"
Apr 16 18:20:10.314420 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.314399 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"]
Apr 16 18:20:10.356241 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.356221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cb1cf8-1390-4752-8384-de949db66371-kserve-provision-location\") pod \"isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk\" (UID: \"34cb1cf8-1390-4752-8384-de949db66371\") " pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"
Apr 16 18:20:10.457473 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.457449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cb1cf8-1390-4752-8384-de949db66371-kserve-provision-location\") pod \"isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk\" (UID: \"34cb1cf8-1390-4752-8384-de949db66371\") " pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"
Apr 16 18:20:10.457784 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.457767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cb1cf8-1390-4752-8384-de949db66371-kserve-provision-location\") pod \"isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk\" (UID: \"34cb1cf8-1390-4752-8384-de949db66371\") " pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"
Apr 16 18:20:10.615326 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.615298 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" Apr 16 18:20:10.739077 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.739052 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"] Apr 16 18:20:10.741264 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:20:10.741235 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cb1cf8_1390_4752_8384_de949db66371.slice/crio-4350ae413013b8d290067e353e8b84ae4f2dd6d869c54850c1373d842d891618 WatchSource:0}: Error finding container 4350ae413013b8d290067e353e8b84ae4f2dd6d869c54850c1373d842d891618: Status 404 returned error can't find the container with id 4350ae413013b8d290067e353e8b84ae4f2dd6d869c54850c1373d842d891618 Apr 16 18:20:10.743055 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.743039 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:20:10.842418 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.842391 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" event={"ID":"34cb1cf8-1390-4752-8384-de949db66371","Type":"ContainerStarted","Data":"b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a"} Apr 16 18:20:10.842531 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:10.842423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" event={"ID":"34cb1cf8-1390-4752-8384-de949db66371","Type":"ContainerStarted","Data":"4350ae413013b8d290067e353e8b84ae4f2dd6d869c54850c1373d842d891618"} Apr 16 18:20:14.853370 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:14.853337 2572 generic.go:358] "Generic (PLEG): container finished" podID="34cb1cf8-1390-4752-8384-de949db66371" containerID="b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a" exitCode=0 Apr 16 18:20:14.853738 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:14.853408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" event={"ID":"34cb1cf8-1390-4752-8384-de949db66371","Type":"ContainerDied","Data":"b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a"} Apr 16 18:20:15.857072 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:15.857035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" event={"ID":"34cb1cf8-1390-4752-8384-de949db66371","Type":"ContainerStarted","Data":"91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729"} Apr 16 18:20:15.857527 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:15.857351 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" Apr 16 18:20:15.858822 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:15.858796 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:20:15.874034 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:15.873993 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podStartSLOduration=5.873979946 podStartE2EDuration="5.873979946s" podCreationTimestamp="2026-04-16 18:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:15.873276616 +0000 UTC m=+1080.731993933" watchObservedRunningTime="2026-04-16 18:20:15.873979946 +0000 UTC m=+1080.732697261" Apr 16 18:20:16.741159 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:16.741121 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 18:20:16.860200 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:16.860167 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:20:18.552974 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.552948 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" Apr 16 18:20:18.610889 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.610863 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84cd0f10-5995-4e4a-a117-686fcb6e6cc7-kserve-provision-location\") pod \"84cd0f10-5995-4e4a-a117-686fcb6e6cc7\" (UID: \"84cd0f10-5995-4e4a-a117-686fcb6e6cc7\") " Apr 16 18:20:18.611163 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.611140 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84cd0f10-5995-4e4a-a117-686fcb6e6cc7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84cd0f10-5995-4e4a-a117-686fcb6e6cc7" (UID: "84cd0f10-5995-4e4a-a117-686fcb6e6cc7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:18.711886 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.711831 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84cd0f10-5995-4e4a-a117-686fcb6e6cc7-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:20:18.866660 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.866627 2572 generic.go:358] "Generic (PLEG): container finished" podID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerID="ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059" exitCode=0 Apr 16 18:20:18.866780 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.866676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" event={"ID":"84cd0f10-5995-4e4a-a117-686fcb6e6cc7","Type":"ContainerDied","Data":"ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059"} Apr 16 18:20:18.866780 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.866687 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" Apr 16 18:20:18.866780 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.866700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6" event={"ID":"84cd0f10-5995-4e4a-a117-686fcb6e6cc7","Type":"ContainerDied","Data":"ac17e0c47e34e00bff04fe6e3c264315062a78cbbb3bed902bf8381bc965f16d"} Apr 16 18:20:18.866780 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.866717 2572 scope.go:117] "RemoveContainer" containerID="ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059" Apr 16 18:20:18.875447 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.874498 2572 scope.go:117] "RemoveContainer" containerID="2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e" Apr 16 18:20:18.882505 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.882489 2572 scope.go:117] "RemoveContainer" containerID="ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059" Apr 16 18:20:18.882749 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:20:18.882732 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059\": container with ID starting with ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059 not found: ID does not exist" containerID="ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059" Apr 16 18:20:18.882791 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.882758 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059"} err="failed to get container status \"ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059\": rpc error: code = NotFound desc = could not find container \"ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059\": container with ID starting with ba1f04ea2d415a4de1773c69836ea1b205ffaedbe78762a1ad6b3b085adbf059 not found: ID does not exist" Apr 16 18:20:18.882791 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.882774 2572 scope.go:117] "RemoveContainer" containerID="2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e" Apr 16 18:20:18.882986 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:20:18.882966 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e\": container with ID starting with 2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e not found: ID does not exist" containerID="2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e" Apr 16 18:20:18.883027 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.882992 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e"} err="failed to get container status \"2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e\": rpc error: code = NotFound desc = could not find container \"2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e\": container with ID starting with 2bfc4672b16e71de83d1efc9fcf0fdb331d52a58a810707f5508c89ae2a3b08e not found: ID does not exist" Apr 16 18:20:18.888481 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.888456 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"] Apr 16 18:20:18.891333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:18.891312 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-4d078-predictor-6fc7dc58dd-49fs6"] Apr 16 18:20:19.743197 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:19.743158 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" path="/var/lib/kubelet/pods/84cd0f10-5995-4e4a-a117-686fcb6e6cc7/volumes" Apr 16 18:20:26.860529 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:26.860483 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:20:36.860849 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:36.860805 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:20:46.860949 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:46.860905 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:20:56.860406 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:20:56.860365 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:21:06.860797 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:06.860757 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:21:16.861480 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:16.861452 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" Apr 16 18:21:20.441601 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.441565 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq"] Apr 16 18:21:20.442049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.441823 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" Apr 16 18:21:20.442049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.441835 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" Apr 16 18:21:20.442049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.441846 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="storage-initializer" Apr 16 18:21:20.442049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.441851 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="storage-initializer" Apr 16 18:21:20.442049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.441898 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cd0f10-5995-4e4a-a117-686fcb6e6cc7" containerName="kserve-container" Apr 16 18:21:20.445133 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.445118 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.448021 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.448000 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-e1f7b3\"" Apr 16 18:21:20.448137 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.448022 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 18:21:20.448137 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.448085 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-e1f7b3-dockercfg-nkp4s\"" Apr 16 18:21:20.453700 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.453681 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq"] Apr 16 18:21:20.528535 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.528491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd991a4e-5318-480b-8567-9b37e2860f56-kserve-provision-location\") pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.528632 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.528566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fd991a4e-5318-480b-8567-9b37e2860f56-cabundle-cert\") pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.628966 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.628940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fd991a4e-5318-480b-8567-9b37e2860f56-cabundle-cert\") pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.629070 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.628994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd991a4e-5318-480b-8567-9b37e2860f56-kserve-provision-location\") pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.629316 
ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.629301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd991a4e-5318-480b-8567-9b37e2860f56-kserve-provision-location\") pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.629476 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.629461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fd991a4e-5318-480b-8567-9b37e2860f56-cabundle-cert\") pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.754830 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.754772 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:20.872189 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:20.872166 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq"] Apr 16 18:21:20.874367 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:21:20.874329 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd991a4e_5318_480b_8567_9b37e2860f56.slice/crio-24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99 WatchSource:0}: Error finding container 24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99: Status 404 returned error can't find the container with id 24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99 Apr 16 18:21:21.028571 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:21.028488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" event={"ID":"fd991a4e-5318-480b-8567-9b37e2860f56","Type":"ContainerStarted","Data":"6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a"} Apr 16 18:21:21.028571 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:21.028539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" event={"ID":"fd991a4e-5318-480b-8567-9b37e2860f56","Type":"ContainerStarted","Data":"24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99"} Apr 16 18:21:24.037531 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:24.037493 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/0.log" Apr 16 18:21:24.037815 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:24.037548 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd991a4e-5318-480b-8567-9b37e2860f56" containerID="6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a" exitCode=1 Apr 16 18:21:24.037815 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:24.037609 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" event={"ID":"fd991a4e-5318-480b-8567-9b37e2860f56","Type":"ContainerDied","Data":"6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a"} Apr 16 
18:21:25.041785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:25.041757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/0.log" Apr 16 18:21:25.042179 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:25.041841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" event={"ID":"fd991a4e-5318-480b-8567-9b37e2860f56","Type":"ContainerStarted","Data":"5b13cb327f6a871355a764d51c29324b96eaf8a7e3f90e4b2c8f0398f42494bf"} Apr 16 18:21:30.055933 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:30.055857 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/1.log" Apr 16 18:21:30.056259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:30.056198 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/0.log" Apr 16 18:21:30.056259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:30.056230 2572 generic.go:358] "Generic (PLEG): container finished" podID="fd991a4e-5318-480b-8567-9b37e2860f56" containerID="5b13cb327f6a871355a764d51c29324b96eaf8a7e3f90e4b2c8f0398f42494bf" exitCode=1 Apr 16 18:21:30.056333 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:30.056299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" event={"ID":"fd991a4e-5318-480b-8567-9b37e2860f56","Type":"ContainerDied","Data":"5b13cb327f6a871355a764d51c29324b96eaf8a7e3f90e4b2c8f0398f42494bf"} Apr 16 18:21:30.056370 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:30.056340 2572 scope.go:117] "RemoveContainer" containerID="6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a" Apr 16 18:21:30.056683 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:30.056666 2572 scope.go:117] "RemoveContainer" containerID="6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a" Apr 16 18:21:30.065698 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:30.065674 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_kserve-ci-e2e-test_fd991a4e-5318-480b-8567-9b37e2860f56_0 in pod sandbox 24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99 from index: no such id: '6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a'" containerID="6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a" Apr 16 18:21:30.065756 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:30.065715 2572 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_kserve-ci-e2e-test_fd991a4e-5318-480b-8567-9b37e2860f56_0 in pod sandbox 24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99 from index: no such id: '6b10843efe70783208aee62e3fdc0cb70c5018903e7990b5ee50ef5c0aa4926a'; Skipping pod \"isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_kserve-ci-e2e-test(fd991a4e-5318-480b-8567-9b37e2860f56)\"" logger="UnhandledError" Apr 16 18:21:30.067011 
ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:30.066992 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_kserve-ci-e2e-test(fd991a4e-5318-480b-8567-9b37e2860f56)\"" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" Apr 16 18:21:31.060293 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:31.060270 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/1.log" Apr 16 18:21:34.541205 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.541130 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq"] Apr 16 18:21:34.608730 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.608705 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"] Apr 16 18:21:34.608967 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.608946 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" containerID="cri-o://91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729" gracePeriod=30 Apr 16 18:21:34.656005 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.655980 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77"] Apr 16 18:21:34.660389 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.660369 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.662906 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.662887 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-89c2e8\"" Apr 16 18:21:34.663008 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.662988 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-89c2e8-dockercfg-pddkc\"" Apr 16 18:21:34.669746 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.669724 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77"] Apr 16 18:21:34.672197 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.672179 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/1.log" Apr 16 18:21:34.672281 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.672242 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:34.736881 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.736860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-kserve-provision-location\") pod \"isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.736962 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.736890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-cabundle-cert\") pod \"isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.837380 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837322 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fd991a4e-5318-480b-8567-9b37e2860f56-cabundle-cert\") pod \"fd991a4e-5318-480b-8567-9b37e2860f56\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " Apr 16 18:21:34.837380 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837366 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd991a4e-5318-480b-8567-9b37e2860f56-kserve-provision-location\") pod \"fd991a4e-5318-480b-8567-9b37e2860f56\" (UID: \"fd991a4e-5318-480b-8567-9b37e2860f56\") " Apr 16 18:21:34.837507 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-kserve-provision-location\") pod \"isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.837608 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-cabundle-cert\") pod \"isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.837710 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837649 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd991a4e-5318-480b-8567-9b37e2860f56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fd991a4e-5318-480b-8567-9b37e2860f56" (UID: "fd991a4e-5318-480b-8567-9b37e2860f56"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:34.837710 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837653 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd991a4e-5318-480b-8567-9b37e2860f56-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fd991a4e-5318-480b-8567-9b37e2860f56" (UID: "fd991a4e-5318-480b-8567-9b37e2860f56"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:34.837871 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837771 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd991a4e-5318-480b-8567-9b37e2860f56-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:21:34.837871 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837790 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fd991a4e-5318-480b-8567-9b37e2860f56-cabundle-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:21:34.837871 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.837831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-kserve-provision-location\") pod \"isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.838108 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.838089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-cabundle-cert\") pod \"isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:34.981600 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:34.981575 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:35.073490 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.073465 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq_fd991a4e-5318-480b-8567-9b37e2860f56/storage-initializer/1.log" Apr 16 18:21:35.073753 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.073569 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" event={"ID":"fd991a4e-5318-480b-8567-9b37e2860f56","Type":"ContainerDied","Data":"24372f6b5ca8f4581e9f5d65a01cc419b484c7c241d994ff6422125d44d10b99"} Apr 16 18:21:35.073753 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.073604 2572 scope.go:117] "RemoveContainer" containerID="5b13cb327f6a871355a764d51c29324b96eaf8a7e3f90e4b2c8f0398f42494bf" Apr 16 18:21:35.073753 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.073693 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq" Apr 16 18:21:35.094178 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.094078 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77"] Apr 16 18:21:35.096411 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:21:35.096392 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5ad73b_bee4_4f01_9d1b_7a8e70a812e0.slice/crio-38d760c3a26f976e21b9ac112a59cb360920de7eb48c012f80f64397d0c21e31 WatchSource:0}: Error finding container 38d760c3a26f976e21b9ac112a59cb360920de7eb48c012f80f64397d0c21e31: Status 404 returned error can't find the container with id 38d760c3a26f976e21b9ac112a59cb360920de7eb48c012f80f64397d0c21e31 Apr 16 18:21:35.109111 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.109088 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq"] Apr 16 18:21:35.113058 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.113035 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-e1f7b3-predictor-8674c76464-jw9vq"] Apr 16 18:21:35.743170 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:35.743131 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" path="/var/lib/kubelet/pods/fd991a4e-5318-480b-8567-9b37e2860f56/volumes" Apr 16 18:21:36.078057 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:36.077979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" event={"ID":"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0","Type":"ContainerStarted","Data":"7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51"} Apr 16 18:21:36.078057 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:36.078017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" event={"ID":"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0","Type":"ContainerStarted","Data":"38d760c3a26f976e21b9ac112a59cb360920de7eb48c012f80f64397d0c21e31"} Apr 16 18:21:36.860929 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:36.860896 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 18:21:37.082894 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:37.082872 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77_2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/storage-initializer/0.log" Apr 16 18:21:37.082989 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:37.082910 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerID="7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51" exitCode=1 Apr 16 18:21:37.082989 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:37.082964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" 
event={"ID":"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0","Type":"ContainerDied","Data":"7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51"} Apr 16 18:21:38.087630 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:38.087606 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77_2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/storage-initializer/0.log" Apr 16 18:21:38.087991 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:38.087709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" event={"ID":"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0","Type":"ContainerStarted","Data":"9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37"} Apr 16 18:21:38.462993 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:38.462970 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" Apr 16 18:21:38.566434 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:38.566404 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cb1cf8-1390-4752-8384-de949db66371-kserve-provision-location\") pod \"34cb1cf8-1390-4752-8384-de949db66371\" (UID: \"34cb1cf8-1390-4752-8384-de949db66371\") " Apr 16 18:21:38.566718 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:38.566697 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cb1cf8-1390-4752-8384-de949db66371-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34cb1cf8-1390-4752-8384-de949db66371" (UID: "34cb1cf8-1390-4752-8384-de949db66371"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:38.667869 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:38.667813 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cb1cf8-1390-4752-8384-de949db66371-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:21:39.091929 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.091843 2572 generic.go:358] "Generic (PLEG): container finished" podID="34cb1cf8-1390-4752-8384-de949db66371" containerID="91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729" exitCode=0 Apr 16 18:21:39.092356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.091932 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" Apr 16 18:21:39.092356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.091931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" event={"ID":"34cb1cf8-1390-4752-8384-de949db66371","Type":"ContainerDied","Data":"91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729"} Apr 16 18:21:39.092356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.091971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk" event={"ID":"34cb1cf8-1390-4752-8384-de949db66371","Type":"ContainerDied","Data":"4350ae413013b8d290067e353e8b84ae4f2dd6d869c54850c1373d842d891618"} Apr 16 18:21:39.092356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.091991 2572 scope.go:117] "RemoveContainer" containerID="91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729" Apr 16 18:21:39.101758 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.101609 2572 scope.go:117] "RemoveContainer" containerID="b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a" Apr 16 18:21:39.108197 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.108179 2572 scope.go:117] "RemoveContainer" containerID="91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729" Apr 16 18:21:39.108430 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:39.108413 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729\": container with ID starting with 91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729 not found: ID does not exist" containerID="91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729" Apr 16 18:21:39.108482 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.108440 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729"} err="failed to get container status \"91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729\": rpc error: code = NotFound desc = could not find container \"91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729\": container with ID starting with 91ac26ea7203a847517871af4c49623d8163c90d8acef1c8214b462f4c4c1729 not found: ID does not exist" Apr 16 18:21:39.108482 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.108456 2572 scope.go:117] "RemoveContainer" containerID="b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a" Apr 16 18:21:39.108740 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:39.108703 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a\": container with ID starting with b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a not found: ID does not exist" containerID="b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a" Apr 16 18:21:39.108809 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.108745 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a"} err="failed to get container status \"b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a\": rpc error: 
code = NotFound desc = could not find container \"b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a\": container with ID starting with b6bbdade7e7b92b6e241b74b5617f4ce29dbe8a00654d73d8043377ee8d3a76a not found: ID does not exist" Apr 16 18:21:39.119322 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.119301 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"] Apr 16 18:21:39.121989 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.121972 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-e1f7b3-predictor-86f6c6f87c-snslk"] Apr 16 18:21:39.640632 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.640597 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77"] Apr 16 18:21:39.743534 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.743486 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34cb1cf8-1390-4752-8384-de949db66371" path="/var/lib/kubelet/pods/34cb1cf8-1390-4752-8384-de949db66371/volumes" Apr 16 18:21:39.776757 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.776724 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w"] Apr 16 18:21:39.777041 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777026 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="storage-initializer" Apr 16 18:21:39.777114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777044 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="storage-initializer" Apr 16 18:21:39.777114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777059 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" containerName="storage-initializer" Apr 16 18:21:39.777114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777067 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" containerName="storage-initializer" Apr 16 18:21:39.777114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777083 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" Apr 16 18:21:39.777114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777092 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" Apr 16 18:21:39.777114 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777111 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" containerName="storage-initializer" Apr 16 18:21:39.777382 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" containerName="storage-initializer" Apr 16 18:21:39.777382 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777190 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="34cb1cf8-1390-4752-8384-de949db66371" containerName="kserve-container" Apr 16 18:21:39.777382 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777201 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fd991a4e-5318-480b-8567-9b37e2860f56" containerName="storage-initializer" Apr 16 18:21:39.777382 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.777212 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd991a4e-5318-480b-8567-9b37e2860f56" containerName="storage-initializer" Apr 16 18:21:39.781591 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.781571 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:21:39.783957 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.783938 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hlvv7\"" Apr 16 18:21:39.790146 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.790121 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w"] Apr 16 18:21:39.876933 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.876902 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7bacec-0328-48d0-ae16-83cff84fe07b-kserve-provision-location\") pod \"raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w\" (UID: \"4d7bacec-0328-48d0-ae16-83cff84fe07b\") " pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:21:39.978313 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.978238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7bacec-0328-48d0-ae16-83cff84fe07b-kserve-provision-location\") pod \"raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w\" (UID: \"4d7bacec-0328-48d0-ae16-83cff84fe07b\") " pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:21:39.978583 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:39.978566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7bacec-0328-48d0-ae16-83cff84fe07b-kserve-provision-location\") pod \"raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w\" (UID: \"4d7bacec-0328-48d0-ae16-83cff84fe07b\") " pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:21:40.091342 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:40.091311 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:21:40.096247 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:40.096212 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" containerID="cri-o://9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37" gracePeriod=30 Apr 16 18:21:40.201196 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:40.201169 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w"] Apr 16 18:21:40.204558 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:21:40.204530 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7bacec_0328_48d0_ae16_83cff84fe07b.slice/crio-f1ea3eecc6ba041544f57e6a14a1eb5586abf476df5b77748d9326b32e6eb6e2 WatchSource:0}: Error finding container f1ea3eecc6ba041544f57e6a14a1eb5586abf476df5b77748d9326b32e6eb6e2: Status 404 returned error can't find the container with id f1ea3eecc6ba041544f57e6a14a1eb5586abf476df5b77748d9326b32e6eb6e2 Apr 16 18:21:41.100421 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.100386 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" event={"ID":"4d7bacec-0328-48d0-ae16-83cff84fe07b","Type":"ContainerStarted","Data":"2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290"} Apr 16 18:21:41.100809 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.100428 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" event={"ID":"4d7bacec-0328-48d0-ae16-83cff84fe07b","Type":"ContainerStarted","Data":"f1ea3eecc6ba041544f57e6a14a1eb5586abf476df5b77748d9326b32e6eb6e2"} Apr 16 18:21:41.825451 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.825431 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77_2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/storage-initializer/1.log" Apr 16 18:21:41.825750 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.825738 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77_2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/storage-initializer/0.log" Apr 16 18:21:41.825806 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.825796 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:41.993397 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.993370 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-cabundle-cert\") pod \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " Apr 16 18:21:41.993553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.993428 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-kserve-provision-location\") pod \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\" (UID: \"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0\") " Apr 16 18:21:41.993710 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.993691 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" (UID: "2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:41.993751 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:41.993735 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" (UID: "2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:42.094627 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.094599 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:21:42.094627 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.094626 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0-cabundle-cert\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:21:42.103664 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.103640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77_2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/storage-initializer/1.log" Apr 16 18:21:42.103973 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.103946 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77_2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/storage-initializer/0.log" Apr 16 18:21:42.104008 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.103977 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerID="9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37" exitCode=1 Apr 16 18:21:42.104053 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.104038 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" Apr 16 18:21:42.104092 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.104058 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" event={"ID":"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0","Type":"ContainerDied","Data":"9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37"} Apr 16 18:21:42.104126 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.104093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77" event={"ID":"2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0","Type":"ContainerDied","Data":"38d760c3a26f976e21b9ac112a59cb360920de7eb48c012f80f64397d0c21e31"} Apr 16 18:21:42.104126 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.104108 2572 scope.go:117] "RemoveContainer" containerID="9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37" Apr 16 18:21:42.111946 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.111829 2572 scope.go:117] "RemoveContainer" containerID="7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51" Apr 16 18:21:42.118942 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.118932 2572 scope.go:117] "RemoveContainer" containerID="9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37" Apr 16 18:21:42.119176 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:42.119157 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37\": container with ID starting with 9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37 not found: ID does not exist" containerID="9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37" Apr 16 18:21:42.119224 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.119183 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37"} err="failed to get container status \"9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37\": rpc error: code = NotFound desc = could not find container \"9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37\": container with ID starting with 9263020aef73b01418eb908659726a0274428eb9163af4e2e83b1bf4d1413d37 not found: ID does not exist" Apr 16 18:21:42.119224 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.119197 2572 scope.go:117] "RemoveContainer" containerID="7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51" Apr 16 18:21:42.119418 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:21:42.119403 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51\": container with ID starting with 7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51 not found: ID does not exist" containerID="7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51" Apr 16 18:21:42.119465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.119422 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51"} err="failed to get container status \"7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51\": rpc 
error: code = NotFound desc = could not find container \"7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51\": container with ID starting with 7d62a619bc1ef2a73ec7d5e88211f30862c8ccb2e8d495ea52eb2d89d9e49b51 not found: ID does not exist" Apr 16 18:21:42.137895 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.137873 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77"] Apr 16 18:21:42.143308 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:42.143288 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-89c2e8-predictor-5cddc74476-w8w77"] Apr 16 18:21:43.743173 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:43.743141 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" path="/var/lib/kubelet/pods/2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0/volumes" Apr 16 18:21:45.113485 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:45.113450 2572 generic.go:358] "Generic (PLEG): container finished" podID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerID="2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290" exitCode=0 Apr 16 18:21:45.113863 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:45.113542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" event={"ID":"4d7bacec-0328-48d0-ae16-83cff84fe07b","Type":"ContainerDied","Data":"2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290"} Apr 16 18:21:46.117475 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:46.117442 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" event={"ID":"4d7bacec-0328-48d0-ae16-83cff84fe07b","Type":"ContainerStarted","Data":"cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7"} Apr 16 18:21:46.117875 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:46.117731 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:21:46.118908 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:46.118877 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:21:46.134414 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:46.134369 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podStartSLOduration=7.134358416 podStartE2EDuration="7.134358416s" podCreationTimestamp="2026-04-16 18:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:46.13295585 +0000 UTC m=+1170.991673411" watchObservedRunningTime="2026-04-16 18:21:46.134358416 +0000 UTC m=+1170.993075733" Apr 16 18:21:47.120749 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:47.120711 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:21:57.120700 
ip-10-0-128-209 kubenswrapper[2572]: I0416 18:21:57.120656 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:22:07.121324 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:07.121285 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:22:15.688932 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:15.688905 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:22:15.689812 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:15.689789 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:22:17.121095 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:17.121059 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:22:27.120840 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:27.120793 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:22:37.121049 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:37.121008 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 18:22:47.122112 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:47.122077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:22:49.855709 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.855678 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w"] Apr 16 18:22:49.856093 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.855916 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" containerID="cri-o://cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7" gracePeriod=30 Apr 16 18:22:49.916373 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916345 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9"] Apr 16 18:22:49.916647 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916633 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" Apr 16 18:22:49.916705 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916651 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" Apr 16 18:22:49.916705 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916665 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" Apr 16 18:22:49.916705 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916670 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" Apr 16 18:22:49.916795 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916721 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" Apr 16 18:22:49.916795 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.916734 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b5ad73b-bee4-4f01-9d1b-7a8e70a812e0" containerName="storage-initializer" Apr 16 18:22:49.919305 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.919287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:22:49.931102 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:49.931081 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9"] Apr 16 18:22:50.056259 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:50.056215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c251f7bb-e117-4a3f-8151-f6f657784699-kserve-provision-location\") pod \"raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9\" (UID: \"c251f7bb-e117-4a3f-8151-f6f657784699\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:22:50.156999 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:50.156938 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c251f7bb-e117-4a3f-8151-f6f657784699-kserve-provision-location\") pod \"raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9\" (UID: \"c251f7bb-e117-4a3f-8151-f6f657784699\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:22:50.157235 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:50.157219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c251f7bb-e117-4a3f-8151-f6f657784699-kserve-provision-location\") pod \"raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9\" (UID: \"c251f7bb-e117-4a3f-8151-f6f657784699\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:22:50.229237 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:50.229212 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:22:50.340304 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:50.340283 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9"] Apr 16 18:22:51.293880 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:51.293843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" event={"ID":"c251f7bb-e117-4a3f-8151-f6f657784699","Type":"ContainerStarted","Data":"b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3"} Apr 16 18:22:51.293880 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:51.293872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" event={"ID":"c251f7bb-e117-4a3f-8151-f6f657784699","Type":"ContainerStarted","Data":"88d7dffce3c3d9e7fbd4f24393bd437cbf5c24fa9655399410eb7c0b15ace121"} Apr 16 18:22:53.587696 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:53.587673 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:22:53.683191 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:53.683114 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7bacec-0328-48d0-ae16-83cff84fe07b-kserve-provision-location\") pod \"4d7bacec-0328-48d0-ae16-83cff84fe07b\" (UID: \"4d7bacec-0328-48d0-ae16-83cff84fe07b\") " Apr 16 18:22:53.683435 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:53.683412 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d7bacec-0328-48d0-ae16-83cff84fe07b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d7bacec-0328-48d0-ae16-83cff84fe07b" (UID: "4d7bacec-0328-48d0-ae16-83cff84fe07b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:22:53.784195 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:53.784170 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d7bacec-0328-48d0-ae16-83cff84fe07b-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:22:54.302154 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.302120 2572 generic.go:358] "Generic (PLEG): container finished" podID="c251f7bb-e117-4a3f-8151-f6f657784699" containerID="b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3" exitCode=0 Apr 16 18:22:54.302341 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.302197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" event={"ID":"c251f7bb-e117-4a3f-8151-f6f657784699","Type":"ContainerDied","Data":"b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3"} Apr 16 18:22:54.303467 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.303443 2572 generic.go:358] "Generic (PLEG): container finished" podID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerID="cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7" exitCode=0 Apr 16 18:22:54.303619 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.303545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" event={"ID":"4d7bacec-0328-48d0-ae16-83cff84fe07b","Type":"ContainerDied","Data":"cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7"} Apr 16 18:22:54.303619 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.303600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" event={"ID":"4d7bacec-0328-48d0-ae16-83cff84fe07b","Type":"ContainerDied","Data":"f1ea3eecc6ba041544f57e6a14a1eb5586abf476df5b77748d9326b32e6eb6e2"} Apr 16 18:22:54.303733 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.303621 2572 scope.go:117] "RemoveContainer" containerID="cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7" Apr 16 18:22:54.303733 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.303562 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w" Apr 16 18:22:54.311227 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.311213 2572 scope.go:117] "RemoveContainer" containerID="2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290" Apr 16 18:22:54.317610 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.317584 2572 scope.go:117] "RemoveContainer" containerID="cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7" Apr 16 18:22:54.317833 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:22:54.317814 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7\": container with ID starting with cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7 not found: ID does not exist" containerID="cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7" Apr 16 18:22:54.317881 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.317840 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7"} err="failed to get container status \"cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7\": rpc error: code = NotFound desc = could not find container \"cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7\": container with ID starting with cd4cdc4e02168245765bc5de6ae480d0c5ae94a665c7a7298a60eaba458f16a7 not found: ID does not exist" Apr 16 18:22:54.317881 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.317856 2572 scope.go:117] "RemoveContainer" containerID="2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290" Apr 16 18:22:54.318099 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:22:54.318081 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290\": container with ID starting with 2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290 not found: ID does not exist" containerID="2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290" Apr 16 18:22:54.318165 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.318106 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290"} err="failed to get container status \"2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290\": rpc error: code = NotFound desc = could not find container \"2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290\": container with ID starting with 2a82f0b67d1add5df8e80d5e137e486ce849e284851cfddb1732d3b2a2ab7290 not found: ID does not exist" Apr 16 18:22:54.332940 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.332914 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w"] Apr 16 18:22:54.338648 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:54.338627 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-94e2d-predictor-864b7f6fd4-bp28w"] Apr 16 18:22:55.308236 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:55.308198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" 
event={"ID":"c251f7bb-e117-4a3f-8151-f6f657784699","Type":"ContainerStarted","Data":"5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506"} Apr 16 18:22:55.308664 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:55.308526 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:22:55.309873 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:55.309843 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:22:55.324686 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:55.324651 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podStartSLOduration=6.32463959 podStartE2EDuration="6.32463959s" podCreationTimestamp="2026-04-16 18:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:55.32388415 +0000 UTC m=+1240.182601469" watchObservedRunningTime="2026-04-16 18:22:55.32463959 +0000 UTC m=+1240.183356907" Apr 16 18:22:55.743608 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:55.743573 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" path="/var/lib/kubelet/pods/4d7bacec-0328-48d0-ae16-83cff84fe07b/volumes" Apr 16 18:22:56.312583 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:22:56.312543 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:23:06.313192 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:23:06.313101 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:23:16.313356 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:23:16.313313 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:23:26.313001 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:23:26.312961 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:23:36.312773 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:23:36.312731 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: 
connection refused" Apr 16 18:23:46.312832 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:23:46.312790 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:23:56.313718 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:23:56.313683 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:24:00.008168 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:00.008135 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9"] Apr 16 18:24:00.008578 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:00.008439 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" containerID="cri-o://5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506" gracePeriod=30 Apr 16 18:24:03.942009 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:03.941987 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:24:03.980240 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:03.980188 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c251f7bb-e117-4a3f-8151-f6f657784699-kserve-provision-location\") pod \"c251f7bb-e117-4a3f-8151-f6f657784699\" (UID: \"c251f7bb-e117-4a3f-8151-f6f657784699\") " Apr 16 18:24:03.980465 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:03.980442 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c251f7bb-e117-4a3f-8151-f6f657784699-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c251f7bb-e117-4a3f-8151-f6f657784699" (UID: "c251f7bb-e117-4a3f-8151-f6f657784699"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:24:04.081419 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.081392 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c251f7bb-e117-4a3f-8151-f6f657784699-kserve-provision-location\") on node \"ip-10-0-128-209.ec2.internal\" DevicePath \"\"" Apr 16 18:24:04.491268 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.491239 2572 generic.go:358] "Generic (PLEG): container finished" podID="c251f7bb-e117-4a3f-8151-f6f657784699" containerID="5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506" exitCode=0 Apr 16 18:24:04.491418 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.491309 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" Apr 16 18:24:04.491418 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.491319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" event={"ID":"c251f7bb-e117-4a3f-8151-f6f657784699","Type":"ContainerDied","Data":"5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506"} Apr 16 18:24:04.491418 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.491358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9" event={"ID":"c251f7bb-e117-4a3f-8151-f6f657784699","Type":"ContainerDied","Data":"88d7dffce3c3d9e7fbd4f24393bd437cbf5c24fa9655399410eb7c0b15ace121"} Apr 16 18:24:04.491418 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.491373 2572 scope.go:117] "RemoveContainer" containerID="5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506" Apr 16 18:24:04.499439 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.499423 2572 scope.go:117] "RemoveContainer" containerID="b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3" Apr 16 18:24:04.506031 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.506013 2572 scope.go:117] "RemoveContainer" containerID="5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506" Apr 16 18:24:04.506257 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:24:04.506239 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506\": container with ID starting with 5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506 not found: ID does not exist" containerID="5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506" Apr 16 18:24:04.506302 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.506265 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506"} err="failed to get container status \"5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506\": rpc error: code = NotFound desc = could not find container \"5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506\": container with ID starting with 5b13f803b7ce182e1c5d101666dd12a3a82bd19fd99ce07fd68e6a3cda479506 not found: ID does not exist" Apr 16 18:24:04.506302 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.506282 2572 scope.go:117] "RemoveContainer" containerID="b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3" Apr 16 18:24:04.506469 ip-10-0-128-209 kubenswrapper[2572]: E0416 18:24:04.506453 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3\": container with ID starting with b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3 not found: ID does not exist" containerID="b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3" Apr 16 18:24:04.506507 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.506476 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3"} err="failed to get container status 
\"b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3\": rpc error: code = NotFound desc = could not find container \"b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3\": container with ID starting with b8ccc386dfc06b0bfd9b30cca0882774c7b28a1497c2f4b764be24fbc1085ac3 not found: ID does not exist" Apr 16 18:24:04.512369 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.512349 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9"] Apr 16 18:24:04.515920 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:04.515901 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9b339-predictor-5bdfb6c5f4-4dzp9"] Apr 16 18:24:05.743092 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:05.743060 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" path="/var/lib/kubelet/pods/c251f7bb-e117-4a3f-8151-f6f657784699/volumes" Apr 16 18:24:28.066404 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:28.066373 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-85dqd_316daf04-aff0-4816-9b83-eae46b1fd37b/global-pull-secret-syncer/0.log" Apr 16 18:24:28.279553 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:28.279522 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zkhsv_b8b228d4-bea7-4887-8dd2-672c2f8c5e45/konnectivity-agent/0.log" Apr 16 18:24:28.302572 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:28.302543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-209.ec2.internal_a9348010412cd69bb166b4f63c170f91/haproxy/0.log" Apr 16 18:24:32.251162 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:32.251126 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-hsnh6_7365984a-9f2d-436f-bea8-7faf76f34ed0/kube-state-metrics/0.log" Apr 16 18:24:32.280149 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:32.280123 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-hsnh6_7365984a-9f2d-436f-bea8-7faf76f34ed0/kube-rbac-proxy-main/0.log" Apr 16 18:24:32.307852 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:32.307832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-hsnh6_7365984a-9f2d-436f-bea8-7faf76f34ed0/kube-rbac-proxy-self/0.log" Apr 16 18:24:32.401318 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:32.401298 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mzpt_800daf97-3eb8-47d8-abed-a15df1b37ef8/node-exporter/0.log" Apr 16 18:24:32.426068 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:32.426046 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mzpt_800daf97-3eb8-47d8-abed-a15df1b37ef8/kube-rbac-proxy/0.log" Apr 16 18:24:32.454494 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:32.454477 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mzpt_800daf97-3eb8-47d8-abed-a15df1b37ef8/init-textfile/0.log" Apr 16 18:24:34.928007 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.927929 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z"] Apr 16 18:24:34.928417 
ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928202 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928215 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928226 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="storage-initializer" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928233 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="storage-initializer" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928240 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928246 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928256 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="storage-initializer" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928262 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="storage-initializer" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928308 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d7bacec-0328-48d0-ae16-83cff84fe07b" containerName="kserve-container" Apr 16 18:24:34.928417 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.928315 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c251f7bb-e117-4a3f-8151-f6f657784699" containerName="kserve-container" Apr 16 18:24:34.932349 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.932327 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:34.935033 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.935013 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4f7nz\"/\"kube-root-ca.crt\"" Apr 16 18:24:34.935133 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.935014 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4f7nz\"/\"default-dockercfg-h5b95\"" Apr 16 18:24:34.936096 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.936080 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4f7nz\"/\"openshift-service-ca.crt\"" Apr 16 18:24:34.940624 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.940605 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z"] Apr 16 18:24:34.996683 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.996660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-proc\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:34.996785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.996697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-sys\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:34.996785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.996715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-podres\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:34.996785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.996731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrftc\" (UniqueName: \"kubernetes.io/projected/a6ad8e02-1690-4684-9c66-47366748bc29-kube-api-access-mrftc\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:34.996785 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:34.996774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-lib-modules\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.097759 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-proc\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " 
pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.097872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-proc\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.097872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-sys\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.097872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-sys\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.097872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-podres\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.097872 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrftc\" (UniqueName: \"kubernetes.io/projected/a6ad8e02-1690-4684-9c66-47366748bc29-kube-api-access-mrftc\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.098091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-lib-modules\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.098091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.097986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-podres\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.098091 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.098014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6ad8e02-1690-4684-9c66-47366748bc29-lib-modules\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.105822 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.105801 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mrftc\" (UniqueName: \"kubernetes.io/projected/a6ad8e02-1690-4684-9c66-47366748bc29-kube-api-access-mrftc\") pod \"perf-node-gather-daemonset-q8s7z\" (UID: \"a6ad8e02-1690-4684-9c66-47366748bc29\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.242243 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.242171 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.355573 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.355508 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z"] Apr 16 18:24:35.357966 ip-10-0-128-209 kubenswrapper[2572]: W0416 18:24:35.357935 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda6ad8e02_1690_4684_9c66_47366748bc29.slice/crio-063a5c7b424b8792bb0867d0471c55320bab090d3f821d0ad7876c36d9cae8a2 WatchSource:0}: Error finding container 063a5c7b424b8792bb0867d0471c55320bab090d3f821d0ad7876c36d9cae8a2: Status 404 returned error can't find the container with id 063a5c7b424b8792bb0867d0471c55320bab090d3f821d0ad7876c36d9cae8a2 Apr 16 18:24:35.575142 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.575088 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" event={"ID":"a6ad8e02-1690-4684-9c66-47366748bc29","Type":"ContainerStarted","Data":"0e0156bb952abcc8f8f47655ddee9560456ee1d352fce5590a6f16c7df49a462"} Apr 16 18:24:35.575142 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.575120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" event={"ID":"a6ad8e02-1690-4684-9c66-47366748bc29","Type":"ContainerStarted","Data":"063a5c7b424b8792bb0867d0471c55320bab090d3f821d0ad7876c36d9cae8a2"} Apr 16 18:24:35.575142 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.575142 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:35.595968 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:35.595933 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" podStartSLOduration=1.5959196960000002 podStartE2EDuration="1.595919696s" podCreationTimestamp="2026-04-16 18:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:24:35.594998709 +0000 UTC m=+1340.453716049" watchObservedRunningTime="2026-04-16 18:24:35.595919696 +0000 UTC m=+1340.454637012" Apr 16 18:24:36.293019 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:36.292993 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wmkzd_237a647b-4edf-4b65-ad09-e3f76a13c168/dns/0.log" Apr 16 18:24:36.319830 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:36.319807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wmkzd_237a647b-4edf-4b65-ad09-e3f76a13c168/kube-rbac-proxy/0.log" Apr 16 18:24:36.393391 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:36.393360 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zdnrz_045160d0-0fd3-47d2-90ec-0bb2af115ef2/dns-node-resolver/0.log" Apr 16 18:24:36.880643 
ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:36.880623 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zbk52_f601d51e-6912-402c-abe7-76ac16678f2a/node-ca/0.log" Apr 16 18:24:37.987678 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:37.987653 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kqct2_f8b0f748-a8be-4032-a386-74c3dc7ad240/serve-healthcheck-canary/0.log" Apr 16 18:24:38.399752 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:38.399719 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qwzcn_dee71a84-286e-4bb9-bf9b-746f703763c2/kube-rbac-proxy/0.log" Apr 16 18:24:38.425182 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:38.425161 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qwzcn_dee71a84-286e-4bb9-bf9b-746f703763c2/exporter/0.log" Apr 16 18:24:38.448914 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:38.448890 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qwzcn_dee71a84-286e-4bb9-bf9b-746f703763c2/extractor/0.log" Apr 16 18:24:40.546358 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:40.546326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-mt87x_e40c940c-a76b-46f0-9dd0-d0d9e342d64a/manager/0.log" Apr 16 18:24:40.657419 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:40.655792 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-9qdgh_593a3f3a-2b1a-4187-879f-c067e73cc4a3/manager/0.log" Apr 16 18:24:40.704786 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:40.704762 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-sq6jw_2773cd95-b80a-4fff-aae0-64c92391563b/seaweedfs/0.log" Apr 16 18:24:41.586132 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:41.586101 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-q8s7z" Apr 16 18:24:46.284973 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.284953 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hksp_d5943e99-4c81-4af3-a008-f184fe0a2d79/kube-multus/0.log" Apr 16 18:24:46.676357 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.676328 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/kube-multus-additional-cni-plugins/0.log" Apr 16 18:24:46.701654 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.701635 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/egress-router-binary-copy/0.log" Apr 16 18:24:46.723428 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.723411 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/cni-plugins/0.log" Apr 16 18:24:46.746565 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.746543 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/bond-cni-plugin/0.log" Apr 16 18:24:46.769287 ip-10-0-128-209 kubenswrapper[2572]: I0416 
18:24:46.769266 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/routeoverride-cni/0.log" Apr 16 18:24:46.793132 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.793109 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/whereabouts-cni-bincopy/0.log" Apr 16 18:24:46.818876 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.818858 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vtf28_f781e5cc-b111-4034-8a85-cae2e3e72a72/whereabouts-cni/0.log" Apr 16 18:24:46.957297 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.957236 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2k4qz_5c4e7715-635e-4cb8-b891-8d2f74e1ef9c/network-metrics-daemon/0.log" Apr 16 18:24:46.978591 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:46.978572 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2k4qz_5c4e7715-635e-4cb8-b891-8d2f74e1ef9c/kube-rbac-proxy/0.log" Apr 16 18:24:48.716896 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.716868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-controller/0.log" Apr 16 18:24:48.739143 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.739115 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/0.log" Apr 16 18:24:48.747638 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.747611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovn-acl-logging/1.log" Apr 16 18:24:48.769770 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.769752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/kube-rbac-proxy-node/0.log" Apr 16 18:24:48.793506 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.793483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:24:48.815118 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.815103 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/northd/0.log" Apr 16 18:24:48.837484 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.837446 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/nbdb/0.log" Apr 16 18:24:48.862023 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.862002 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/sbdb/0.log" Apr 16 18:24:48.954198 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:48.954175 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x27gf_533bfb3b-fb81-47d8-a968-aa3baab674a7/ovnkube-controller/0.log" Apr 16 18:24:50.040751 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:50.040725 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jp7mf_164700ca-d6d4-4aee-86e6-4fca944bb4b5/network-check-target-container/0.log" Apr 16 18:24:50.968452 ip-10-0-128-209 kubenswrapper[2572]: I0416 18:24:50.968425 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f2fhk_40ee05ef-cbcc-43c7-8d8e-d8c52630c3cd/iptables-alerter/0.log"