Apr 16 16:45:02.424679 ip-10-0-130-1 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:45:02.424688 ip-10-0-130-1 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:45:02.424695 ip-10-0-130-1 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:45:02.425017 ip-10-0-130-1 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:45:12.527974 ip-10-0-130-1 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:45:12.527997 ip-10-0-130-1 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b17ecd4ed831457eb740add9429ed6d1 --
Apr 16 16:47:22.769204 ip-10-0-130-1 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:47:23.265395 ip-10-0-130-1 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:23.265395 ip-10-0-130-1 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:47:23.265395 ip-10-0-130-1 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:23.265395 ip-10-0-130-1 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:47:23.265395 ip-10-0-130-1 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:23.268620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.268536 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:47:23.271024 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271009 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:23.271024 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271024 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271027 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271030 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271033 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271035 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271039 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271042 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271045 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271047 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271050 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271052 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271060 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271062 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271065 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271067 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271070 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271073 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271075 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271078 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271080 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:23.271087 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271083 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271085 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271088 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271091 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271094 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271096 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271100 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271103 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271105 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271108 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271111 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271114 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271116 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271119 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271121 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271123 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271127 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271129 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271131 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271134 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:23.271597 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271136 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271139 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271143 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
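The boot before the `-- Boot … --` marker fails with result 'resources': systemd could not load the unit's environment files or run its 'start-pre' task, and the restart could not even be scheduled because crio.service was not yet present. A minimal sketch of how one might check for the missing paths on such a node; the unit name comes from the log, but the two file paths below are hypothetical stand-ins for whatever the unit's EnvironmentFile= and ExecStartPre= lines actually reference.

```shell
# Hedged sketch: systemd's "Failed with result 'resources'" usually means a
# path the unit references does not exist. On the node one would inspect the
# unit definition first:
#   systemctl cat kubelet.service    # find EnvironmentFile= / ExecStartPre= paths
#   systemctl status crio.service    # the restart failed because this unit was absent
# Then verify each referenced path exists. The paths here are illustrative only.
for f in /etc/kubernetes/kubelet-env /usr/local/bin/kubelet-pre-start.sh; do
  if [ -e "$f" ]; then
    echo "present: $f"
  else
    echo "MISSING: $f"
  fi
done
```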
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271146 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271149 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271152 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271154 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271157 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271159 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271161 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271164 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271166 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271169 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271172 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271175 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271177 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271180 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271182 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271185 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271187 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:23.272130 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271190 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271192 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271195 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271199 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271203 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271206 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271209 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271211 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271214 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271216 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271219 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271221 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271224 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271227 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271229 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271231 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271234 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271237 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271239 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:23.272652 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271242 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271244 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271247 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271249 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271251 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271254 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271630 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271636 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
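From this point the `unrecognized feature gate` warnings repeat: the kubelet parses the gate list more than once at startup (note the klog timestamps jumping from .271254 back through a second .271630 pass), so every gate the upstream kubelet does not recognize is logged twice. When reading a journal like this, the noise can be collapsed to one count per gate. A small sketch; the inline here-doc stands in for real `journalctl -u kubelet.service` output.

```shell
# Count each distinct "unrecognized feature gate" warning. In practice the
# input would come from:
#   journalctl -u kubelet.service | grep 'unrecognized feature gate'
# A tiny inline sample stands in for the journal here.
cat <<'EOF' > /tmp/kubelet-gates.log
W0416 16:47:23.271030 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
W0416 16:47:23.271206 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
W0416 16:47:23.271643 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
EOF
grep -o 'unrecognized feature gate: [A-Za-z0-9]*' /tmp/kubelet-gates.log |
  sort | uniq -c | sort -rn
```

The `sort | uniq -c | sort -rn` pipeline puts the most-repeated gates first, which makes the duplicated second pass obvious at a glance.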
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271640 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271643 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271646 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271650 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271653 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271656 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271658 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271662 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271665 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271668 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271670 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271673 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:23.273193 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271675 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271678 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271680 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271683 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271685 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271688 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271690 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271693 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271695 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271697 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271700 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271703 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271705 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271707 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271710 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271712 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271715 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271725 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271727 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:23.273697 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271730 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271733 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271735 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271738 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271741 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271743 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271746 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271748 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271751 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271754 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271756 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271759 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271762 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271764 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271767 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271769 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271771 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271774 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271776 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271779 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:23.274213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271781 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271784 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271786 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271789 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271792 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271794 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271797 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271799 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271801 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271804 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271807 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271809 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271813 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271815 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271818 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271820 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271823 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271825 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271828 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271830 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:23.274732 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271833 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271835 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271838 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271840 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271843 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271846 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271849 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271851 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271854 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271856 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271859 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271863 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.271865 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273282 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273292 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273298 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273303 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273307 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273310 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273314 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273319 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:47:23.275272 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273322 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273325 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273329 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273332 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273336 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273339 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273342 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273344 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273347 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273350 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273353 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273356 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273359 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273362 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273365 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273368 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273373 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273376 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273379 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273383 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273386 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273389 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273391 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273394 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273397 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:47:23.275832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273401 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273404 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273407 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273410 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273413 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273416 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273421 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273424 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273427 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273431 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273434 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273437 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273440 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273443 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273446 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273449 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273452
2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273455 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273457 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273460 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273463 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273466 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273470 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273472 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273475 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:47:23.276486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273478 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273481 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273484 2576 flags.go:64] FLAG: --help="false" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273487 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273490 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273493 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:47:23.277156 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:23.273496 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273499 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273502 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273505 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273508 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273510 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273513 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273516 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273519 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273522 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273526 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273529 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273532 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273534 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 
16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273537 2576 flags.go:64] FLAG: --lock-file="" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273540 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273543 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273546 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:47:23.277156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273551 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273554 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273556 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273559 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273562 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273565 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273568 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273570 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273575 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273578 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273582 2576 flags.go:64] FLAG: --max-pods="110" Apr 
16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273585 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273588 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273591 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273594 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273597 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273602 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273605 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273613 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273616 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273619 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273622 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273625 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:47:23.277769 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273631 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273633 2576 
flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273636 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273639 2576 flags.go:64] FLAG: --port="10250" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273642 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273645 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f814d25bfd881672" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273648 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273651 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273654 2576 flags.go:64] FLAG: --register-node="true" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273657 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273660 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273664 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273666 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273669 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273672 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273676 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273679 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 
16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273681 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273684 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273687 2576 flags.go:64] FLAG: --runonce="false" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273690 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273693 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273696 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273699 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273702 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273707 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:47:23.278427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273710 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273716 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273719 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273721 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273724 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273728 2576 
flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273731 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273734 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273737 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273743 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273745 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273748 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273752 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273754 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273757 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273760 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273762 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273765 2576 flags.go:64] FLAG: --v="2" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273769 2576 flags.go:64] FLAG: --version="false" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273773 2576 flags.go:64] FLAG: --vmodule="" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273777 2576 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.273781 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273880 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273884 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:47:23.279183 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273887 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273890 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273892 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273895 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273897 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273900 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273902 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273906 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273908 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273912 2576 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273914 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273917 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273919 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273922 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273925 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273928 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273930 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273934 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273956 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:47:23.279843 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273962 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273968 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273972 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273975 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273978 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273980 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273983 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273986 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273988 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273991 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273994 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273996 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.273999 2576 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274002 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274004 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274007 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274009 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274011 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274014 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274016 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:47:23.280707 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274019 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274022 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274025 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274028 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274030 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274033 2576 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274035 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274038 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274042 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274045 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274047 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274050 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274052 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274055 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274057 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274060 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274062 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274065 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:47:23.281458 ip-10-0-130-1 
kubenswrapper[2576]: W0416 16:47:23.274067 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274070 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:47:23.281458 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274072 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274075 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274078 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274080 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274083 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274085 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274088 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274090 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274093 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274095 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274098 2576 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274100 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274102 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274105 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274109 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274112 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274114 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274117 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274119 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274122 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:47:23.282006 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274124 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274127 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274130 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274132 2576 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.274135 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.274822 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.282532 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:47:23.282569 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.282546 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282595 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282601 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282604 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282607 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282610 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282613 2576 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesvSphere Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282616 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282619 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282621 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282624 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282627 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282629 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282632 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282634 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282637 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282639 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282642 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282644 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282647 
2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:47:23.282771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282649 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282652 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282654 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282656 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282659 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282661 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282664 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282666 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282668 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282676 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282678 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282681 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:47:23.283311 ip-10-0-130-1 
kubenswrapper[2576]: W0416 16:47:23.282683 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282686 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282689 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282692 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282695 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282697 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282700 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282703 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:47:23.283311 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282705 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282708 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282711 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282713 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282716 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 
16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282718 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282721 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282723 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282725 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282728 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282730 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282734 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282738 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282740 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282743 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282746 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282748 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282751 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282753 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:47:23.283829 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282756 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282758 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282761 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282763 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282766 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282769 2576 feature_gate.go:328] 
unrecognized feature gate: BuildCSIVolumes Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282772 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282775 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282778 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282781 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282783 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282786 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282790 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282794 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282797 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282800 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282802 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282805 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282809 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282811 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:47:23.284335 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282814 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282816 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282819 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282821 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282824 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282826 2576 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerification Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282829 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282831 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.282836 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282971 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282977 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282981 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282983 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282986 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282989 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282992 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement 
Apr 16 16:47:23.284868 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282995 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.282997 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283000 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283004 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283006 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283009 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283011 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283014 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283016 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283019 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283022 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283024 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283027 2576 feature_gate.go:328] unrecognized feature 
gate: ClusterVersionOperatorConfiguration Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283029 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283032 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283050 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283054 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283057 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283059 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283062 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:47:23.285400 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283065 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283069 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283073 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283076 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283078 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283081 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283083 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283086 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283088 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283091 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283093 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283096 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283098 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283101 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283103 2576 feature_gate.go:328] 
unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283106 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283108 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283111 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283114 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283116 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:47:23.285925 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283118 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283121 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283123 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283126 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283129 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283131 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283133 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283136 2576 feature_gate.go:328] unrecognized 
feature gate: KMSEncryptionProvider Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283139 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283142 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283144 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283146 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283149 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283151 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283153 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283156 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283158 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283161 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283163 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283166 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:47:23.286457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283168 2576 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283170 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283173 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283175 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283178 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283180 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283183 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283186 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283188 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283191 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283193 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283197 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283200 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283203 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283206 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283208 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283211 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283213 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:47:23.286990 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:23.283216 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:47:23.287455 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.283221 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:47:23.287455 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.284011 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:47:23.288221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.288207 2576 bootstrap.go:101] "Use the bootstrap 
credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:47:23.289259 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.289248 2576 server.go:1019] "Starting client certificate rotation" Apr 16 16:47:23.289361 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.289343 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:47:23.289392 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.289381 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:47:23.320148 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.320131 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:47:23.323599 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.323586 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:47:23.338662 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.338638 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:47:23.348479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.348458 2576 log.go:25] "Validated CRI v1 image API" Apr 16 16:47:23.353702 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.353685 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:47:23.356954 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.356918 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:47:23.358138 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.358118 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c541da94-0302-49e6-837b-1168e5f17b58:/dev/nvme0n1p3 ff84f41e-c6ca-4f89-ad46-33ab52a2ca2c:/dev/nvme0n1p4] Apr 16 16:47:23.358184 
ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.358139 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:47:23.364124 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.364017 2576 manager.go:217] Machine: {Timestamp:2026-04-16 16:47:23.361799701 +0000 UTC m=+0.459678787 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101111 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2567b8b99e18b43c61b368795ab338 SystemUUID:ec2567b8-b99e-18b4-3c61-b368795ab338 BootID:b17ecd4e-d831-457e-b740-add9429ed6d1 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4e:84:85:f4:6d Speed:0 Mtu:9001} {Name:ens5 
MacAddress:02:4e:84:85:f4:6d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:81:2b:f0:a0:eb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:47:23.364124 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.364120 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 16:47:23.364223 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.364194 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 16:47:23.365395 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.365375 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 16:47:23.365542 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.365398 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-1.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"contain
er","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 16:47:23.365583 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.365551 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 16:47:23.365583 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.365560 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 16:47:23.365583 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.365576 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:47:23.366927 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.366916 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:47:23.368133 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.368123 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:47:23.368230 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.368221 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:47:23.370842 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.370832 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:47:23.370884 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.370847 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:47:23.370884 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.370858 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:47:23.370884 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.370868 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:47:23.370884 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.370878 2576 apiserver.go:42] "Waiting for node sync before watching apiserver 
pods" Apr 16 16:47:23.372101 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.372088 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:47:23.372211 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.372107 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:47:23.376398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.376385 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:47:23.378752 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.378739 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:47:23.380436 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380425 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:47:23.380479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380442 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:47:23.380479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380450 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:47:23.380479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380456 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:47:23.380479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380461 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:47:23.380479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380467 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:47:23.380479 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380472 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:47:23.380479 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:23.380478 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:47:23.380661 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380485 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:47:23.380661 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380491 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:47:23.380661 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380517 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:47:23.380661 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.380526 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:47:23.382056 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.382042 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:47:23.382106 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.382058 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:47:23.383267 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.383243 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-1.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:47:23.383316 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.383260 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:47:23.385611 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.385598 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 
16:47:23.385674 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.385635 2576 server.go:1295] "Started kubelet" Apr 16 16:47:23.385737 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.385714 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:47:23.385783 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.385723 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:47:23.385827 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.385808 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:47:23.386511 ip-10-0-130-1 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:47:23.387151 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.387134 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:47:23.388827 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.388814 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:47:23.395162 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.395130 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-1.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:47:23.396240 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.395081 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-1.ec2.internal.18a6e43a99e841a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-1.ec2.internal,UID:ip-10-0-130-1.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-1.ec2.internal,},FirstTimestamp:2026-04-16 16:47:23.385610664 +0000 UTC m=+0.483489749,LastTimestamp:2026-04-16 16:47:23.385610664 +0000 UTC m=+0.483489749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-1.ec2.internal,}" Apr 16 16:47:23.398191 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.398172 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:47:23.398757 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.398741 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:47:23.398993 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.398974 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:47:23.399567 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399551 2576 factory.go:55] Registering systemd factory Apr 16 16:47:23.399661 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399574 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:47:23.399721 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399712 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:47:23.399768 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399728 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:47:23.399768 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399746 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:47:23.399851 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399779 2576 factory.go:153] Registering CRI-O factory Apr 16 16:47:23.399851 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399789 2576 factory.go:223] Registration of the crio container factory 
successfully Apr 16 16:47:23.399962 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399856 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:47:23.399962 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399857 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:47:23.399962 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399862 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:47:23.399962 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399879 2576 factory.go:103] Registering Raw factory Apr 16 16:47:23.399962 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.399893 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 16:47:23.400206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.400184 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f7whc" Apr 16 16:47:23.400291 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.400274 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found" Apr 16 16:47:23.400580 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.400568 2576 manager.go:319] Starting recovery of all containers Apr 16 16:47:23.402158 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.402137 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:47:23.402333 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.402227 2576 controller.go:145] "Failed to ensure lease 
exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-1.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:47:23.408441 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.408214 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f7whc" Apr 16 16:47:23.410601 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.410148 2576 manager.go:324] Recovery completed Apr 16 16:47:23.414454 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.414440 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:23.416714 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.416700 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:23.416778 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.416725 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:23.416778 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.416734 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:23.417229 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.417214 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:47:23.417229 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.417226 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:47:23.417335 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.417239 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:47:23.418634 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.418573 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-1.ec2.internal.18a6e43a9bc2d915 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-1.ec2.internal,UID:ip-10-0-130-1.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-1.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-1.ec2.internal,},FirstTimestamp:2026-04-16 16:47:23.416713493 +0000 UTC m=+0.514592559,LastTimestamp:2026-04-16 16:47:23.416713493 +0000 UTC m=+0.514592559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-1.ec2.internal,}" Apr 16 16:47:23.420065 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.420051 2576 policy_none.go:49] "None policy: Start" Apr 16 16:47:23.420123 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.420069 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:47:23.420123 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.420080 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:47:23.465490 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.465473 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.465509 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.465522 2576 server.go:85] "Starting device plugin registration server" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.465703 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:47:23.465712 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.465808 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.465866 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.465874 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.466409 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:47:23.471330 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.466450 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-1.ec2.internal\" not found" Apr 16 16:47:23.503111 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.503082 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:47:23.504286 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.504268 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 16:47:23.504286 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.504289 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:47:23.504414 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.504304 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 16:47:23.504414 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.504310 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:47:23.504414 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.504339 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:47:23.508396 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.508377 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:23.566215 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.566173 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:23.567381 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.567368 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:23.567454 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.567393 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:23.567454 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.567403 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:23.567454 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.567429 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.576031 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.576012 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.576114 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.576035 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-1.ec2.internal\": node \"ip-10-0-130-1.ec2.internal\" not found" Apr 16 16:47:23.593187 ip-10-0-130-1 
kubenswrapper[2576]: E0416 16:47:23.593167 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found" Apr 16 16:47:23.604775 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.604758 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal"] Apr 16 16:47:23.604845 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.604817 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:23.605562 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.605546 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:23.605657 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.605570 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:23.605657 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.605580 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:23.606625 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.606613 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:23.606789 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.606777 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.606839 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.606804 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:23.614968 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.614926 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:23.615061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.614926 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:23.615061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.615009 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:23.615061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.615028 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:23.615061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.614976 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:23.615061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.615061 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:23.616623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.616600 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.616694 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.616635 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:23.617478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.617463 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:23.617544 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.617489 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:23.617544 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.617500 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:23.640587 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.640570 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-1.ec2.internal\" not found" node="ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.645236 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.645222 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-1.ec2.internal\" not found" node="ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.693555 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.693536 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found" Apr 16 16:47:23.701924 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.701905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b2d196e213c72dc0e82e18661b4a3b77-config\") pod \"kube-apiserver-proxy-ip-10-0-130-1.ec2.internal\" (UID: 
\"b2d196e213c72dc0e82e18661b4a3b77\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.701995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.701930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd85adef90a5128f2b5c6f8eebd4a888-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal\" (UID: \"fd85adef90a5128f2b5c6f8eebd4a888\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.701995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.701966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd85adef90a5128f2b5c6f8eebd4a888-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal\" (UID: \"fd85adef90a5128f2b5c6f8eebd4a888\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.794053 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.794025 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found" Apr 16 16:47:23.802449 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.802432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b2d196e213c72dc0e82e18661b4a3b77-config\") pod \"kube-apiserver-proxy-ip-10-0-130-1.ec2.internal\" (UID: \"b2d196e213c72dc0e82e18661b4a3b77\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.802501 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.802454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd85adef90a5128f2b5c6f8eebd4a888-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal\" 
(UID: \"fd85adef90a5128f2b5c6f8eebd4a888\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.802501 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.802471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd85adef90a5128f2b5c6f8eebd4a888-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal\" (UID: \"fd85adef90a5128f2b5c6f8eebd4a888\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.802569 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.802518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd85adef90a5128f2b5c6f8eebd4a888-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal\" (UID: \"fd85adef90a5128f2b5c6f8eebd4a888\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.802569 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.802526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b2d196e213c72dc0e82e18661b4a3b77-config\") pod \"kube-apiserver-proxy-ip-10-0-130-1.ec2.internal\" (UID: \"b2d196e213c72dc0e82e18661b4a3b77\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.802569 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.802544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fd85adef90a5128f2b5c6f8eebd4a888-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal\" (UID: \"fd85adef90a5128f2b5c6f8eebd4a888\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" Apr 16 16:47:23.894874 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.894827 2576 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found"
Apr 16 16:47:23.943208 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.943193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal"
Apr 16 16:47:23.947614 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:23.947597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal"
Apr 16 16:47:23.995057 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:23.995035 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found"
Apr 16 16:47:24.095599 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:24.095569 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found"
Apr 16 16:47:24.196103 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:24.196056 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found"
Apr 16 16:47:24.288647 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.288627 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:47:24.289115 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.288755 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:47:24.296776 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:24.296758 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-1.ec2.internal\" not found"
Apr 16 16:47:24.372383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.372363 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:24.398254 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.398233 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:47:24.399810 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.399794 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal"
Apr 16 16:47:24.408186 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.408171 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:47:24.411146 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.411118 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:42:23 +0000 UTC" deadline="2027-10-14 04:25:58.31491624 +0000 UTC"
Apr 16 16:47:24.411221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.411146 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13091h38m33.903774874s"
Apr 16 16:47:24.412262 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.412240 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:47:24.413897 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.413880 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal"
Apr 16 16:47:24.420268 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.420251 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:47:24.440267 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:24.440245 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd85adef90a5128f2b5c6f8eebd4a888.slice/crio-e4f43048491dc0881e04b28d1aaa50febf5b52d1f5180a4981f174c3e41d259a WatchSource:0}: Error finding container e4f43048491dc0881e04b28d1aaa50febf5b52d1f5180a4981f174c3e41d259a: Status 404 returned error can't find the container with id e4f43048491dc0881e04b28d1aaa50febf5b52d1f5180a4981f174c3e41d259a
Apr 16 16:47:24.440750 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:24.440728 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d196e213c72dc0e82e18661b4a3b77.slice/crio-b2c2b851fcb8d49fc2a9d2d19f8e05527e02091b9d7c6cd820ce4392d7bbc2d1 WatchSource:0}: Error finding container b2c2b851fcb8d49fc2a9d2d19f8e05527e02091b9d7c6cd820ce4392d7bbc2d1: Status 404 returned error can't find the container with id b2c2b851fcb8d49fc2a9d2d19f8e05527e02091b9d7c6cd820ce4392d7bbc2d1
Apr 16 16:47:24.446428 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.446397 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:47:24.458494 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.458477 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7lxdl"
Apr 16 16:47:24.466779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.466761 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7lxdl"
Apr 16 16:47:24.506515 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.506477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" event={"ID":"fd85adef90a5128f2b5c6f8eebd4a888","Type":"ContainerStarted","Data":"e4f43048491dc0881e04b28d1aaa50febf5b52d1f5180a4981f174c3e41d259a"}
Apr 16 16:47:24.507310 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.507292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" event={"ID":"b2d196e213c72dc0e82e18661b4a3b77","Type":"ContainerStarted","Data":"b2c2b851fcb8d49fc2a9d2d19f8e05527e02091b9d7c6cd820ce4392d7bbc2d1"}
Apr 16 16:47:24.666776 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.666760 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:24.938275 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:24.938202 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:25.371830 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.371769 2576 apiserver.go:52] "Watching apiserver"
Apr 16 16:47:25.380464 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.380438 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:47:25.380784 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.380758 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4l5qj","openshift-multus/network-metrics-daemon-dc7qq","openshift-network-operator/iptables-alerter-9xzz6","openshift-cluster-node-tuning-operator/tuned-8fpkc","openshift-dns/node-resolver-4l9b9","openshift-image-registry/node-ca-k2zl5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal","openshift-multus/multus-xgfzz","openshift-network-diagnostics/network-check-target-b2qz6","openshift-ovn-kubernetes/ovnkube-node-6dbws","kube-system/konnectivity-agent-znpbx","kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"]
Apr 16 16:47:25.382211 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.382188 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.383552 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.383528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:47:25.383657 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.383611 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07"
Apr 16 16:47:25.384577 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.384558 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9xzz6"
Apr 16 16:47:25.385133 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.385081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:47:25.385677 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.385657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8fpkc"
Apr 16 16:47:25.386296 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.386222 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:47:25.386296 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.386243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:47:25.386296 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.386252 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.386504 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.386296 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:47:25.386504 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.386322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.387024 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.387005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4l9b9"
Apr 16 16:47:25.387799 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.387650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.387799 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.387671 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jx86s\""
Apr 16 16:47:25.387799 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.387716 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:47:25.388072 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.388050 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.388205 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.388185 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b4xpn\""
Apr 16 16:47:25.388425 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.388408 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-k2zl5"
Apr 16 16:47:25.389133 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.388840 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.389133 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.388900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.389283 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.389262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.389541 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.389523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-l4c4x\""
Apr 16 16:47:25.389631 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.389615 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.390048 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.390026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6nfnn\""
Apr 16 16:47:25.390128 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.390111 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.390811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.390792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6"
Apr 16 16:47:25.390900 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.390856 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0"
Apr 16 16:47:25.392117 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392097 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.392203 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4l5qj"
Apr 16 16:47:25.392203 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fjtgd\""
Apr 16 16:47:25.392203 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.392447 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392426 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:47:25.392514 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:47:25.392899 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392718 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-657sj\""
Apr 16 16:47:25.392899 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392743 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.392899 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:47:25.392899 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.392848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.393826 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.393771 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-znpbx"
Apr 16 16:47:25.394613 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.394597 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:47:25.395033 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.394931 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"
Apr 16 16:47:25.395360 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.395203 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f8b9c\""
Apr 16 16:47:25.395360 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.395253 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.402344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.402356 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.402547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-88mzj\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.402793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.402805 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.402924 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4shpx\""
Apr 16 16:47:25.403147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.403025 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:47:25.404459 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.404441 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:47:25.409674 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4267203-3686-4c79-a755-afbc3279763c-ovn-node-metrics-cert\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.409743 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-cni-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.409743 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgwmd\" (UniqueName: \"kubernetes.io/projected/9e2838a2-f0a3-4285-86fb-f54be274ccfa-kube-api-access-xgwmd\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.409743 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-system-cni-dir\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj"
Apr 16 16:47:25.409905 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-socket-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"
Apr 16 16:47:25.409905 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.409905 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-tmp-dir\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9"
Apr 16 16:47:25.409905 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-netns\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.409905 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-log-socket\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4ck\" (UniqueName: \"kubernetes.io/projected/a4267203-3686-4c79-a755-afbc3279763c-kube-api-access-bc4ck\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-lib-modules\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.409992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a59ddb8d-570d-4782-afa6-1d2e490cf42f-konnectivity-ca\") pod \"konnectivity-agent-znpbx\" (UID: \"a59ddb8d-570d-4782-afa6-1d2e490cf42f\") " pod="kube-system/konnectivity-agent-znpbx"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32688da-6a07-4fb0-a11d-64239ab022f4-host\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-device-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/477dec0e-139a-477a-85c0-0229e6e2398d-tmp\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-cni-multus\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.410157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-etc-kubernetes\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-cni-binary-copy\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-cni-bin\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-sys\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-k8s-cni-cncf-io\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-conf-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-iptables-alerter-script\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-systemd\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-cni-netd\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cbfn\" (UniqueName: \"kubernetes.io/projected/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-kube-api-access-4cbfn\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-kubelet\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-var-lib-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-etc-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-ovnkube-script-lib\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.410588 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-var-lib-kubelet\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-socket-dir-parent\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-multus-certs\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k852c\" (UniqueName: \"kubernetes.io/projected/c32688da-6a07-4fb0-a11d-64239ab022f4-kube-api-access-k852c\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-registration-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-systemd-units\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-slash\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-daemon-config\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-cnibin\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-env-overrides\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-sys-fs\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-run-netns\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-ovn\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-kubernetes\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-cni-bin\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.411181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.410979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbv79\" (UniqueName: \"kubernetes.io/projected/0f7cce27-fc9e-437d-9147-a82b82151b07-kube-api-access-pbv79\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:25.411838 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:25.411024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-modprobe-d\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysctl-d\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-cnibin\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-kubelet\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-hostroot\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 
16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-os-release\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-systemd\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-run\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-host\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-os-release\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " 
pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrmx\" (UniqueName: \"kubernetes.io/projected/20a28d9b-77a3-42fd-bd13-72cb783d8673-kube-api-access-jtrmx\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.411838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411420 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a59ddb8d-570d-4782-afa6-1d2e490cf42f-agent-certs\") pod \"konnectivity-agent-znpbx\" (UID: \"a59ddb8d-570d-4782-afa6-1d2e490cf42f\") " pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c32688da-6a07-4fb0-a11d-64239ab022f4-serviceca\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-node-log\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysconfig\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysctl-conf\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.412620 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:25.411550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-system-cni-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-host-slash\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzgx\" (UniqueName: \"kubernetes.io/projected/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-kube-api-access-lvzgx\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg5v\" (UniqueName: \"kubernetes.io/projected/810ea4ed-899e-4f5f-908f-30d9aff93364-kube-api-access-lhg5v\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-openvswitch\") pod 
\"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-ovnkube-config\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/477dec0e-139a-477a-85c0-0229e6e2398d-etc-tuned\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4ts\" (UniqueName: \"kubernetes.io/projected/477dec0e-139a-477a-85c0-0229e6e2398d-kube-api-access-vc4ts\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-hosts-file\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.412620 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.411831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/9e2838a2-f0a3-4285-86fb-f54be274ccfa-cni-binary-copy\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.467693 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.467667 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:24 +0000 UTC" deadline="2027-12-06 22:48:07.361697616 +0000 UTC" Apr 16 16:47:25.467693 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.467691 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14382h0m41.894010216s" Apr 16 16:47:25.512154 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4267203-3686-4c79-a755-afbc3279763c-ovn-node-metrics-cert\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.512253 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-cni-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.512253 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwmd\" (UniqueName: \"kubernetes.io/projected/9e2838a2-f0a3-4285-86fb-f54be274ccfa-kube-api-access-xgwmd\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.512253 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512215 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-system-cni-dir\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-socket-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-cni-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-tmp-dir\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:47:25.512337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-netns\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-system-cni-dir\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-log-socket\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.512398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-log-socket\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc4ck\" (UniqueName: \"kubernetes.io/projected/a4267203-3686-4c79-a755-afbc3279763c-kube-api-access-bc4ck\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:47:25.512437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-socket-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-lib-modules\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-netns\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a59ddb8d-570d-4782-afa6-1d2e490cf42f-konnectivity-ca\") pod \"konnectivity-agent-znpbx\" (UID: \"a59ddb8d-570d-4782-afa6-1d2e490cf42f\") " pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:47:25.512590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32688da-6a07-4fb0-a11d-64239ab022f4-host\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-device-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c32688da-6a07-4fb0-a11d-64239ab022f4-host\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512640 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:47:25.512811 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-lib-modules\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-device-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.512912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-tmp-dir\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/477dec0e-139a-477a-85c0-0229e6e2398d-tmp\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-cni-multus\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-etc-kubernetes\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513261 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-cni-binary-copy\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-cni-bin\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-sys\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-k8s-cni-cncf-io\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-conf-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a59ddb8d-570d-4782-afa6-1d2e490cf42f-konnectivity-ca\") pod \"konnectivity-agent-znpbx\" (UID: \"a59ddb8d-570d-4782-afa6-1d2e490cf42f\") " pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-iptables-alerter-script\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-cni-multus\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-systemd\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.513478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-etc-kubernetes\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513820 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-cni-bin\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.513820 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-cni-netd\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.513820 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513509 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-k8s-cni-cncf-io\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.513820 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-systemd\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.513962 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.513837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-sys\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.514044 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-conf-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.514408 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.514386 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:25.514466 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-cni-binary-copy\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.514511 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.514479 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:26.014447952 +0000 UTC m=+3.112327022 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:25.514574 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbfn\" (UniqueName: \"kubernetes.io/projected/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-kube-api-access-4cbfn\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.514636 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-iptables-alerter-script\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.514636 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-kubelet\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.514636 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-var-lib-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.514738 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:47:25.514639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-etc-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.514738 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.514738 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-ovnkube-script-lib\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.514738 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-var-lib-kubelet\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.514738 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-kubelet\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.514738 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:25.514724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-socket-dir-parent\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.514922 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-var-lib-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.514922 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-var-lib-kubelet\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.514922 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-etc-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515075 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.514981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-multus-certs\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.515075 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:25.515021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k852c\" (UniqueName: \"kubernetes.io/projected/c32688da-6a07-4fb0-a11d-64239ab022f4-kube-api-access-k852c\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.515075 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-registration-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.515075 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-socket-dir-parent\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-systemd-units\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-registration-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-slash\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-run-multus-certs\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-slash\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-systemd-units\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-cni-netd\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515426 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-daemon-config\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.515426 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-cnibin\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.515426 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-env-overrides\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515426 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-ovnkube-script-lib\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515426 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-cnibin\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " 
pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.515619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-sys-fs\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.515619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-run-netns\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-sys-fs\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.515619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-ovn\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-run-netns\") pod \"ovnkube-node-6dbws\" (UID: 
\"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-kubernetes\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-cni-bin\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-env-overrides\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-ovn\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbv79\" (UniqueName: \"kubernetes.io/projected/0f7cce27-fc9e-437d-9147-a82b82151b07-kube-api-access-pbv79\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-kubernetes\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-modprobe-d\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.515833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810ea4ed-899e-4f5f-908f-30d9aff93364-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9e2838a2-f0a3-4285-86fb-f54be274ccfa-multus-daemon-config\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-cni-bin\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-modprobe-d\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.515994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysctl-d\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-cnibin\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-kubelet\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-hostroot\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysctl-d\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-os-release\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-systemd\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-host-var-lib-kubelet\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-run\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-host\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-cnibin\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-os-release\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516221 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-os-release\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-hostroot\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20a28d9b-77a3-42fd-bd13-72cb783d8673-os-release\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrmx\" (UniqueName: \"kubernetes.io/projected/20a28d9b-77a3-42fd-bd13-72cb783d8673-kube-api-access-jtrmx\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-systemd\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516331 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a59ddb8d-570d-4782-afa6-1d2e490cf42f-agent-certs\") pod \"konnectivity-agent-znpbx\" (UID: \"a59ddb8d-570d-4782-afa6-1d2e490cf42f\") " pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-run\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c32688da-6a07-4fb0-a11d-64239ab022f4-serviceca\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-host\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-node-log\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysconfig\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysctl-conf\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-system-cni-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-host-slash\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.516649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzgx\" (UniqueName: \"kubernetes.io/projected/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-kube-api-access-lvzgx\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lhg5v\" (UniqueName: \"kubernetes.io/projected/810ea4ed-899e-4f5f-908f-30d9aff93364-kube-api-access-lhg5v\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-ovnkube-config\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/477dec0e-139a-477a-85c0-0229e6e2398d-etc-tuned\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vc4ts\" (UniqueName: \"kubernetes.io/projected/477dec0e-139a-477a-85c0-0229e6e2398d-kube-api-access-vc4ts\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516671 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-hosts-file\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e2838a2-f0a3-4285-86fb-f54be274ccfa-cni-binary-copy\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.517295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.516796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.517529 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.517435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20a28d9b-77a3-42fd-bd13-72cb783d8673-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.517529 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.517493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e2838a2-f0a3-4285-86fb-f54be274ccfa-cni-binary-copy\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.517529 ip-10-0-130-1 kubenswrapper[2576]: 
I0416 16:47:25.517500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/477dec0e-139a-477a-85c0-0229e6e2398d-tmp\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.517619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.517569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-run-openvswitch\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.517619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.517599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-host-slash\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.517826 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.517801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e2838a2-f0a3-4285-86fb-f54be274ccfa-system-cni-dir\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.518044 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.517964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4267203-3686-4c79-a755-afbc3279763c-ovn-node-metrics-cert\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.518559 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.518536 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4267203-3686-4c79-a755-afbc3279763c-node-log\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.518712 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.518691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c32688da-6a07-4fb0-a11d-64239ab022f4-serviceca\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.519239 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.519110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-hosts-file\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.519330 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.519279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysctl-conf\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.519609 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.519577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4267203-3686-4c79-a755-afbc3279763c-ovnkube-config\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.519743 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.519718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/477dec0e-139a-477a-85c0-0229e6e2398d-etc-sysconfig\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.520809 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.520788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/477dec0e-139a-477a-85c0-0229e6e2398d-etc-tuned\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.521549 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.521515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4ck\" (UniqueName: \"kubernetes.io/projected/a4267203-3686-4c79-a755-afbc3279763c-kube-api-access-bc4ck\") pod \"ovnkube-node-6dbws\" (UID: \"a4267203-3686-4c79-a755-afbc3279763c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.523154 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.523130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwmd\" (UniqueName: \"kubernetes.io/projected/9e2838a2-f0a3-4285-86fb-f54be274ccfa-kube-api-access-xgwmd\") pod \"multus-xgfzz\" (UID: \"9e2838a2-f0a3-4285-86fb-f54be274ccfa\") " pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.523776 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.523712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a59ddb8d-570d-4782-afa6-1d2e490cf42f-agent-certs\") pod \"konnectivity-agent-znpbx\" (UID: \"a59ddb8d-570d-4782-afa6-1d2e490cf42f\") " pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:25.525010 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.524986 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Apr 16 16:47:25.525010 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.525008 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:25.525150 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.525021 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:25.525150 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:25.525098 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:26.025065822 +0000 UTC m=+3.122944921 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:25.525412 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.525387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbfn\" (UniqueName: \"kubernetes.io/projected/51e9a221-2ee4-44c8-bb3a-29addd9e2fe5-kube-api-access-4cbfn\") pod \"node-resolver-4l9b9\" (UID: \"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5\") " pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.527803 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.527451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc4ts\" (UniqueName: \"kubernetes.io/projected/477dec0e-139a-477a-85c0-0229e6e2398d-kube-api-access-vc4ts\") pod \"tuned-8fpkc\" (UID: \"477dec0e-139a-477a-85c0-0229e6e2398d\") " pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.527803 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.527570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzgx\" (UniqueName: \"kubernetes.io/projected/06a3bd25-03ba-42cc-8d7f-5aac5e0fc674-kube-api-access-lvzgx\") pod \"iptables-alerter-9xzz6\" (UID: \"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674\") " pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.527803 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.527645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbv79\" (UniqueName: \"kubernetes.io/projected/0f7cce27-fc9e-437d-9147-a82b82151b07-kube-api-access-pbv79\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " 
pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:25.528303 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.528285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg5v\" (UniqueName: \"kubernetes.io/projected/810ea4ed-899e-4f5f-908f-30d9aff93364-kube-api-access-lhg5v\") pod \"aws-ebs-csi-driver-node-7fxqg\" (UID: \"810ea4ed-899e-4f5f-908f-30d9aff93364\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:25.529191 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.529169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k852c\" (UniqueName: \"kubernetes.io/projected/c32688da-6a07-4fb0-a11d-64239ab022f4-kube-api-access-k852c\") pod \"node-ca-k2zl5\" (UID: \"c32688da-6a07-4fb0-a11d-64239ab022f4\") " pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.530412 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.530374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrmx\" (UniqueName: \"kubernetes.io/projected/20a28d9b-77a3-42fd-bd13-72cb783d8673-kube-api-access-jtrmx\") pod \"multus-additional-cni-plugins-4l5qj\" (UID: \"20a28d9b-77a3-42fd-bd13-72cb783d8673\") " pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.686051 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.685969 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:25.703514 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.703484 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:25.706300 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.706280 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9xzz6" Apr 16 16:47:25.714901 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.714885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" Apr 16 16:47:25.719279 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.719253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4l9b9" Apr 16 16:47:25.725780 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.725762 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-k2zl5" Apr 16 16:47:25.733369 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.733350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xgfzz" Apr 16 16:47:25.740886 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.740869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" Apr 16 16:47:25.748504 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.748482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:25.754036 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:25.754017 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" Apr 16 16:47:26.020581 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.020471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:26.020736 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:26.020643 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:26.020736 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:26.020712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:27.020691982 +0000 UTC m=+4.118571037 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:26.083034 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.082924 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4267203_3686_4c79_a755_afbc3279763c.slice/crio-a8bf425a087c0539fc347bd422cd651be13cc337d158eb301cc3af80430afd50 WatchSource:0}: Error finding container a8bf425a087c0539fc347bd422cd651be13cc337d158eb301cc3af80430afd50: Status 404 returned error can't find the container with id a8bf425a087c0539fc347bd422cd651be13cc337d158eb301cc3af80430afd50 Apr 16 16:47:26.101771 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.101732 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32688da_6a07_4fb0_a11d_64239ab022f4.slice/crio-6f3f5f52a4d391a7cae905ad6367adca7919bec2218b2526cbd81247f232c753 WatchSource:0}: Error finding container 6f3f5f52a4d391a7cae905ad6367adca7919bec2218b2526cbd81247f232c753: Status 404 returned error can't find the container with id 6f3f5f52a4d391a7cae905ad6367adca7919bec2218b2526cbd81247f232c753 Apr 16 16:47:26.104213 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.104186 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda59ddb8d_570d_4782_afa6_1d2e490cf42f.slice/crio-7784b8a0769188ed49c30717e30df91a80717167fff779a7b8b33ca0f8d2f159 WatchSource:0}: Error finding container 7784b8a0769188ed49c30717e30df91a80717167fff779a7b8b33ca0f8d2f159: Status 404 returned error can't find the container with id 7784b8a0769188ed49c30717e30df91a80717167fff779a7b8b33ca0f8d2f159 Apr 16 16:47:26.104970 ip-10-0-130-1 
kubenswrapper[2576]: W0416 16:47:26.104851 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e2838a2_f0a3_4285_86fb_f54be274ccfa.slice/crio-57113233a6e35003fc32bbd57d5a6ed4d41f4d50bc38131f00fa0a98f5c0dc1c WatchSource:0}: Error finding container 57113233a6e35003fc32bbd57d5a6ed4d41f4d50bc38131f00fa0a98f5c0dc1c: Status 404 returned error can't find the container with id 57113233a6e35003fc32bbd57d5a6ed4d41f4d50bc38131f00fa0a98f5c0dc1c Apr 16 16:47:26.109447 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.109427 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e9a221_2ee4_44c8_bb3a_29addd9e2fe5.slice/crio-41421c86cbe95a78ed740d6cc2e2448da4c48042ff666c54d723f90631c3cf04 WatchSource:0}: Error finding container 41421c86cbe95a78ed740d6cc2e2448da4c48042ff666c54d723f90631c3cf04: Status 404 returned error can't find the container with id 41421c86cbe95a78ed740d6cc2e2448da4c48042ff666c54d723f90631c3cf04 Apr 16 16:47:26.110313 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.110293 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a3bd25_03ba_42cc_8d7f_5aac5e0fc674.slice/crio-9603adb8c4dc546bd84bcd2881ab3451b0517ab4b4014c0f615e9159ad53e10f WatchSource:0}: Error finding container 9603adb8c4dc546bd84bcd2881ab3451b0517ab4b4014c0f615e9159ad53e10f: Status 404 returned error can't find the container with id 9603adb8c4dc546bd84bcd2881ab3451b0517ab4b4014c0f615e9159ad53e10f Apr 16 16:47:26.111350 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.111312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a28d9b_77a3_42fd_bd13_72cb783d8673.slice/crio-ec3042d0ef72089c6d67b9fc66281ac9b272fef3a00cacbd5fda34f4d1751f78 WatchSource:0}: Error finding container 
ec3042d0ef72089c6d67b9fc66281ac9b272fef3a00cacbd5fda34f4d1751f78: Status 404 returned error can't find the container with id ec3042d0ef72089c6d67b9fc66281ac9b272fef3a00cacbd5fda34f4d1751f78 Apr 16 16:47:26.112258 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.112231 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477dec0e_139a_477a_85c0_0229e6e2398d.slice/crio-288fc8d211fa9d49de01cdb4644a6d606065cef20ac130fe27df8ca81fecec2a WatchSource:0}: Error finding container 288fc8d211fa9d49de01cdb4644a6d606065cef20ac130fe27df8ca81fecec2a: Status 404 returned error can't find the container with id 288fc8d211fa9d49de01cdb4644a6d606065cef20ac130fe27df8ca81fecec2a Apr 16 16:47:26.113455 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:47:26.113365 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810ea4ed_899e_4f5f_908f_30d9aff93364.slice/crio-7d55fb8630849e99792ccc05d2954fb898a50a38d7fd6e8c7925feffc0f9ef39 WatchSource:0}: Error finding container 7d55fb8630849e99792ccc05d2954fb898a50a38d7fd6e8c7925feffc0f9ef39: Status 404 returned error can't find the container with id 7d55fb8630849e99792ccc05d2954fb898a50a38d7fd6e8c7925feffc0f9ef39 Apr 16 16:47:26.121097 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.121069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:26.121381 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:26.121214 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 
16:47:26.121381 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:26.121236 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:26.121381 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:26.121249 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:26.121381 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:26.121308 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:27.121291211 +0000 UTC m=+4.219170281 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:26.468331 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.468003 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:24 +0000 UTC" deadline="2027-09-30 00:14:47.39049477 +0000 UTC" Apr 16 16:47:26.468331 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.468257 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12751h27m20.922244577s" Apr 16 16:47:26.517160 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.517121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" event={"ID":"b2d196e213c72dc0e82e18661b4a3b77","Type":"ContainerStarted","Data":"4106b7b493aa34451fd27d7f57d5c4290188e23d59e9d8e9254312b4491e8b4f"} Apr 16 16:47:26.525566 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.525532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" event={"ID":"810ea4ed-899e-4f5f-908f-30d9aff93364","Type":"ContainerStarted","Data":"7d55fb8630849e99792ccc05d2954fb898a50a38d7fd6e8c7925feffc0f9ef39"} Apr 16 16:47:26.530533 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.530474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerStarted","Data":"ec3042d0ef72089c6d67b9fc66281ac9b272fef3a00cacbd5fda34f4d1751f78"} Apr 16 16:47:26.533390 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:26.533326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9xzz6" event={"ID":"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674","Type":"ContainerStarted","Data":"9603adb8c4dc546bd84bcd2881ab3451b0517ab4b4014c0f615e9159ad53e10f"} Apr 16 16:47:26.538476 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.538429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4l9b9" event={"ID":"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5","Type":"ContainerStarted","Data":"41421c86cbe95a78ed740d6cc2e2448da4c48042ff666c54d723f90631c3cf04"} Apr 16 16:47:26.547968 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.546056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-znpbx" event={"ID":"a59ddb8d-570d-4782-afa6-1d2e490cf42f","Type":"ContainerStarted","Data":"7784b8a0769188ed49c30717e30df91a80717167fff779a7b8b33ca0f8d2f159"} Apr 16 16:47:26.553873 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.553670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" event={"ID":"477dec0e-139a-477a-85c0-0229e6e2398d","Type":"ContainerStarted","Data":"288fc8d211fa9d49de01cdb4644a6d606065cef20ac130fe27df8ca81fecec2a"} Apr 16 16:47:26.556826 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.556802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xgfzz" event={"ID":"9e2838a2-f0a3-4285-86fb-f54be274ccfa","Type":"ContainerStarted","Data":"57113233a6e35003fc32bbd57d5a6ed4d41f4d50bc38131f00fa0a98f5c0dc1c"} Apr 16 16:47:26.567857 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:26.567834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k2zl5" event={"ID":"c32688da-6a07-4fb0-a11d-64239ab022f4","Type":"ContainerStarted","Data":"6f3f5f52a4d391a7cae905ad6367adca7919bec2218b2526cbd81247f232c753"} Apr 16 16:47:26.574184 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:47:26.574154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"a8bf425a087c0539fc347bd422cd651be13cc337d158eb301cc3af80430afd50"} Apr 16 16:47:27.030273 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.030236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:27.030447 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.030372 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:27.030447 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.030434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:29.030416071 +0000 UTC m=+6.128295137 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:27.131450 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.131413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:27.131644 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.131601 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:27.131644 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.131620 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:27.131644 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.131633 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:27.131826 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.131690 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:47:29.131672185 +0000 UTC m=+6.229551248 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:27.505969 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.505869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:27.506918 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.506444 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:27.506918 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.506794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:27.506918 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:27.506875 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:27.583425 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.582291 2576 generic.go:358] "Generic (PLEG): container finished" podID="fd85adef90a5128f2b5c6f8eebd4a888" containerID="f02ad1aefbd8891d4bcf220b43ccd84188334a0184ccea53b05afd843a0bb08d" exitCode=0 Apr 16 16:47:27.583425 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.583219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" event={"ID":"fd85adef90a5128f2b5c6f8eebd4a888","Type":"ContainerDied","Data":"f02ad1aefbd8891d4bcf220b43ccd84188334a0184ccea53b05afd843a0bb08d"} Apr 16 16:47:27.597724 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:27.596595 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-1.ec2.internal" podStartSLOduration=3.596579994 podStartE2EDuration="3.596579994s" podCreationTimestamp="2026-04-16 16:47:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:26.535749452 +0000 UTC m=+3.633628528" watchObservedRunningTime="2026-04-16 16:47:27.596579994 +0000 UTC m=+4.694459072" Apr 16 16:47:28.588142 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:28.588104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" event={"ID":"fd85adef90a5128f2b5c6f8eebd4a888","Type":"ContainerStarted","Data":"7dc01ab21ef5a7ada71d3572cbdfd0ab92c5440045640d99e09ceab6b10fc6d8"} Apr 16 16:47:29.053507 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:29.053470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod 
\"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:29.053676 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.053654 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:29.053751 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.053728 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:33.053707164 +0000 UTC m=+10.151586221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:29.154638 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:29.154595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:29.154822 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.154805 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:29.154921 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.154824 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 
16:47:29.154921 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.154842 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:29.154921 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.154897 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:33.15487963 +0000 UTC m=+10.252758685 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:29.505499 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:29.505469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:29.505671 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.505594 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:29.507956 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:29.507788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:29.507956 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:29.507898 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:31.504835 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:31.504802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:31.505292 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:31.504932 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:31.505582 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:31.505428 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:31.505582 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:31.505546 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:33.083088 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:33.083047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:33.083499 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.083238 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:33.083499 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.083303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:41.083284135 +0000 UTC m=+18.181163201 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:33.184523 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:33.184484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:33.184677 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.184644 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:33.184677 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.184663 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:33.184677 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.184675 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:33.184824 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.184731 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:47:41.184713618 +0000 UTC m=+18.282592687 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:33.506083 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:33.505799 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:33.506083 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.505911 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:33.506312 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:33.506263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:33.506405 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:33.506370 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:35.505368 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:35.505334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:35.505762 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:35.505346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:35.505762 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:35.505471 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:35.505762 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:35.505551 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:37.505074 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:37.504540 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:37.505074 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:37.504659 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:37.505597 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:37.505220 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:37.505597 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:37.505326 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:39.504640 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:39.504603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:39.505085 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:39.504606 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:39.505085 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:39.504728 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:39.505085 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:39.504801 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:41.146719 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:41.146688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:41.147212 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.146799 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:41.147212 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.146856 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:47:57.146841261 +0000 UTC m=+34.244720319 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:41.247147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:41.247114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:41.247319 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.247300 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:41.247395 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.247326 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:41.247395 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.247338 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:41.247490 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.247404 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj 
podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:57.247385206 +0000 UTC m=+34.345264263 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:41.505062 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:41.505031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:41.505241 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:41.505031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:41.505241 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.505182 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:41.505356 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:41.505233 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:43.505984 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:43.505962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:43.506294 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:43.506048 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:43.506294 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:43.506129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:43.506294 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:43.506220 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:44.111018 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.110737 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-1.ec2.internal" podStartSLOduration=20.110723268 podStartE2EDuration="20.110723268s" podCreationTimestamp="2026-04-16 16:47:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:28.603372742 +0000 UTC m=+5.701251835" watchObservedRunningTime="2026-04-16 16:47:44.110723268 +0000 UTC m=+21.208602344" Apr 16 16:47:44.111333 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.111320 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cwn6w"] Apr 16 16:47:44.126926 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.126908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.127020 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:44.126992 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af" Apr 16 16:47:44.270156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.270087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.270156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.270120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/185db826-8d91-4046-9d72-6213e3ded5af-dbus\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.270156 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.270149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/185db826-8d91-4046-9d72-6213e3ded5af-kubelet-config\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.371132 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.371112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/185db826-8d91-4046-9d72-6213e3ded5af-kubelet-config\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.371223 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.371158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.371223 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.371177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/185db826-8d91-4046-9d72-6213e3ded5af-dbus\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.371336 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.371225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/185db826-8d91-4046-9d72-6213e3ded5af-kubelet-config\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.371336 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:44.371283 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:44.371431 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:44.371352 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret podName:185db826-8d91-4046-9d72-6213e3ded5af nodeName:}" failed. No retries permitted until 2026-04-16 16:47:44.871332476 +0000 UTC m=+21.969211544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret") pod "global-pull-secret-syncer-cwn6w" (UID: "185db826-8d91-4046-9d72-6213e3ded5af") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:44.371431 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.371361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/185db826-8d91-4046-9d72-6213e3ded5af-dbus\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.614093 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.614024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" event={"ID":"810ea4ed-899e-4f5f-908f-30d9aff93364","Type":"ContainerStarted","Data":"212a3dae569a296ddc70b66d80b3407c300c7b180ab78d59a9a8b80374cb2490"} Apr 16 16:47:44.615305 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.615283 2576 generic.go:358] "Generic (PLEG): container finished" podID="20a28d9b-77a3-42fd-bd13-72cb783d8673" containerID="fc5a8e9c1a50675921be46896fa6451d7b74f1bb23ca2773cf292a2032280de7" exitCode=0 Apr 16 16:47:44.615365 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.615352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerDied","Data":"fc5a8e9c1a50675921be46896fa6451d7b74f1bb23ca2773cf292a2032280de7"} Apr 16 16:47:44.616803 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.616696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4l9b9" event={"ID":"51e9a221-2ee4-44c8-bb3a-29addd9e2fe5","Type":"ContainerStarted","Data":"e4c5718323621950d97b6b0554277045100c112ab11d273ccc6347a187213f68"} Apr 16 
16:47:44.618172 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.618146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-znpbx" event={"ID":"a59ddb8d-570d-4782-afa6-1d2e490cf42f","Type":"ContainerStarted","Data":"a44f53e94789b67ef3877f1e654ea9d59269a9736d5185ad9c51af404f7638b3"} Apr 16 16:47:44.619739 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.619713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" event={"ID":"477dec0e-139a-477a-85c0-0229e6e2398d","Type":"ContainerStarted","Data":"b16d236a861540ed28be98fe01b6d6cf79c121347485b286434cf5f20c9d7033"} Apr 16 16:47:44.621224 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.621203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xgfzz" event={"ID":"9e2838a2-f0a3-4285-86fb-f54be274ccfa","Type":"ContainerStarted","Data":"5ca2ed84e2ed7fcc00db9d6e14de33cf2a09e168ae9ed17ccd82d591ed025527"} Apr 16 16:47:44.622473 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.622456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k2zl5" event={"ID":"c32688da-6a07-4fb0-a11d-64239ab022f4","Type":"ContainerStarted","Data":"2444e68559e11673519044bcd6348f7f774aa68b4776938402f1193b239bf3c9"} Apr 16 16:47:44.624804 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.624778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 16:47:44.625159 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625135 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4267203-3686-4c79-a755-afbc3279763c" containerID="e9296ac592c4b4511ef8abf3d6f705bd9944c14b78dedb9d5ff04e843cbfa727" exitCode=1 Apr 16 16:47:44.625223 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"bee1053a72e713b5fc5fd6e2824ad3a6c82d492c79fa054880c9cb0d97f5ecf7"} Apr 16 16:47:44.625223 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"e20f49155b71bee64efaf35912f28c2aac0d66dff806fc6772fe5a073b3a0b58"} Apr 16 16:47:44.625223 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"bc7bc3b440fa2368e9da474030b6317830f8f761d8d344f74e35a73ceb077e31"} Apr 16 16:47:44.625309 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"4ef266145fbfa4560ddcc407111b16d0c45e377e269b9c1414a0c625dcfb22d8"} Apr 16 16:47:44.625309 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerDied","Data":"e9296ac592c4b4511ef8abf3d6f705bd9944c14b78dedb9d5ff04e843cbfa727"} Apr 16 16:47:44.625309 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.625251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"de41d0b0eea211a6e1d298440e9c9975335f869f16ce739b9ea4bb204e0dc0a4"} Apr 16 16:47:44.647380 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.647340 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4l9b9" podStartSLOduration=4.187669399 podStartE2EDuration="21.647328852s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.111214233 +0000 UTC m=+3.209093288" lastFinishedPulling="2026-04-16 16:47:43.570873684 +0000 UTC m=+20.668752741" observedRunningTime="2026-04-16 16:47:44.647189124 +0000 UTC m=+21.745068199" watchObservedRunningTime="2026-04-16 16:47:44.647328852 +0000 UTC m=+21.745207928" Apr 16 16:47:44.658869 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.658828 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-k2zl5" podStartSLOduration=9.09310063 podStartE2EDuration="21.658811816s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.107822595 +0000 UTC m=+3.205701649" lastFinishedPulling="2026-04-16 16:47:38.673533776 +0000 UTC m=+15.771412835" observedRunningTime="2026-04-16 16:47:44.658771058 +0000 UTC m=+21.756650133" watchObservedRunningTime="2026-04-16 16:47:44.658811816 +0000 UTC m=+21.756690896" Apr 16 16:47:44.673255 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.673220 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8fpkc" podStartSLOduration=4.193320296 podStartE2EDuration="21.673206554s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.115307529 +0000 UTC m=+3.213186586" lastFinishedPulling="2026-04-16 16:47:43.595193789 +0000 UTC m=+20.693072844" observedRunningTime="2026-04-16 16:47:44.672812998 +0000 UTC m=+21.770692074" watchObservedRunningTime="2026-04-16 16:47:44.673206554 +0000 UTC m=+21.771085630" Apr 16 16:47:44.691090 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.691047 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-xgfzz" podStartSLOduration=4.191984637 podStartE2EDuration="21.6910348s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.10779813 +0000 UTC m=+3.205677198" lastFinishedPulling="2026-04-16 16:47:43.606848293 +0000 UTC m=+20.704727361" observedRunningTime="2026-04-16 16:47:44.690865301 +0000 UTC m=+21.788744378" watchObservedRunningTime="2026-04-16 16:47:44.6910348 +0000 UTC m=+21.788913876" Apr 16 16:47:44.707815 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.707782 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-znpbx" podStartSLOduration=4.221981887 podStartE2EDuration="21.707771728s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.107834023 +0000 UTC m=+3.205713098" lastFinishedPulling="2026-04-16 16:47:43.593623845 +0000 UTC m=+20.691502939" observedRunningTime="2026-04-16 16:47:44.707493657 +0000 UTC m=+21.805372730" watchObservedRunningTime="2026-04-16 16:47:44.707771728 +0000 UTC m=+21.805650803" Apr 16 16:47:44.876935 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:44.876879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:44.877029 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:44.876977 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:44.877029 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:44.877019 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret 
podName:185db826-8d91-4046-9d72-6213e3ded5af nodeName:}" failed. No retries permitted until 2026-04-16 16:47:45.877008279 +0000 UTC m=+22.974887332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret") pod "global-pull-secret-syncer-cwn6w" (UID: "185db826-8d91-4046-9d72-6213e3ded5af") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:45.075114 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.075091 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:47:45.401478 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.401449 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:45.402245 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.402220 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:45.477306 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.477173 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:47:45.075110675Z","UUID":"17f8811d-e6cf-4bd4-9e5e-f3fd2557ae23","Handler":null,"Name":"","Endpoint":""} Apr 16 16:47:45.478863 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.478841 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:47:45.478999 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.478873 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:47:45.505162 
ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.505129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:45.505370 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:45.505345 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:45.505449 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.505365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:45.505486 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:45.505449 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:45.629414 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.629383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" event={"ID":"810ea4ed-899e-4f5f-908f-30d9aff93364","Type":"ContainerStarted","Data":"5e2f50313623ccae875ccdb5d6b80cf450e1ebfb7e27f4e9b40ad8a9c15b6d56"} Apr 16 16:47:45.631850 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.631824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9xzz6" event={"ID":"06a3bd25-03ba-42cc-8d7f-5aac5e0fc674","Type":"ContainerStarted","Data":"ef6619d1bd3cbc20f01cb2372029a657eb65d43d68f1c73c6b720102e13dd6e5"} Apr 16 16:47:45.649728 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.649684 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9xzz6" podStartSLOduration=5.168835004 podStartE2EDuration="22.649671319s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.112426608 +0000 UTC m=+3.210305663" lastFinishedPulling="2026-04-16 16:47:43.59326291 +0000 UTC m=+20.691141978" observedRunningTime="2026-04-16 16:47:45.646815534 +0000 UTC m=+22.744694612" watchObservedRunningTime="2026-04-16 16:47:45.649671319 +0000 UTC m=+22.747550394" Apr 16 16:47:45.884903 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:45.884872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:45.885085 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:45.884995 2576 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:45.885085 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:45.885044 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret podName:185db826-8d91-4046-9d72-6213e3ded5af nodeName:}" failed. No retries permitted until 2026-04-16 16:47:47.885031628 +0000 UTC m=+24.982910683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret") pod "global-pull-secret-syncer-cwn6w" (UID: "185db826-8d91-4046-9d72-6213e3ded5af") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:46.064335 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.064307 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:46.064932 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.064907 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-znpbx" Apr 16 16:47:46.504619 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.504593 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:46.504720 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:46.504686 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af" Apr 16 16:47:46.636744 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.636722 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 16:47:46.637383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.637352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"4e14359858a77ed973d02cbc168b43e83e2406acf0281e8f1aaae3cd458ff23a"} Apr 16 16:47:46.639399 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.639374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" event={"ID":"810ea4ed-899e-4f5f-908f-30d9aff93364","Type":"ContainerStarted","Data":"91a4fe6fd4be025eea9a3175ac5117e0b24321d0da53cf04b1b8ba218a4f28e1"} Apr 16 16:47:46.659784 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:46.659732 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fxqg" podStartSLOduration=3.667351948 podStartE2EDuration="23.659715192s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.115346327 +0000 UTC m=+3.213225385" lastFinishedPulling="2026-04-16 16:47:46.10770956 +0000 UTC m=+23.205588629" observedRunningTime="2026-04-16 16:47:46.658968343 +0000 UTC m=+23.756847422" watchObservedRunningTime="2026-04-16 16:47:46.659715192 +0000 UTC m=+23.757594269" Apr 16 16:47:47.505071 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:47.505035 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:47.505237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:47.505044 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:47.505237 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:47.505160 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:47.505338 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:47.505260 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:47.904796 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:47.904524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:47.904796 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:47.904669 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:47.905285 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:47.904826 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret podName:185db826-8d91-4046-9d72-6213e3ded5af nodeName:}" failed. No retries permitted until 2026-04-16 16:47:51.904808467 +0000 UTC m=+29.002687528 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret") pod "global-pull-secret-syncer-cwn6w" (UID: "185db826-8d91-4046-9d72-6213e3ded5af") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:48.504976 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:48.504928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:48.505137 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:48.505047 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af" Apr 16 16:47:49.505396 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.505367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:49.506171 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:49.505490 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:49.506171 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.505540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:49.506171 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:49.505629 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:49.650052 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.650026 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 16:47:49.650426 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.650393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"3341a25094e6c77f24925cd047a1cc78bdc59177697d55761d6d4ddf1df502be"} Apr 16 16:47:49.650707 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.650682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:49.650796 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.650716 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:49.650874 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.650848 2576 scope.go:117] "RemoveContainer" containerID="e9296ac592c4b4511ef8abf3d6f705bd9944c14b78dedb9d5ff04e843cbfa727" Apr 16 16:47:49.652652 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.652496 2576 generic.go:358] "Generic (PLEG): container finished" podID="20a28d9b-77a3-42fd-bd13-72cb783d8673" containerID="24ea6a09e6fe70840f89d9deeaee578b30014cb2b9a97d657fea9cc3d5efd3c5" exitCode=0 Apr 16 16:47:49.652738 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:49.652578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerDied","Data":"24ea6a09e6fe70840f89d9deeaee578b30014cb2b9a97d657fea9cc3d5efd3c5"} Apr 16 16:47:49.668060 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:47:49.668041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:47:50.505488 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.505471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:50.505747 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:50.505557 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af" Apr 16 16:47:50.642437 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.642385 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cwn6w"] Apr 16 16:47:50.645337 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.645312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b2qz6"] Apr 16 16:47:50.645446 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.645429 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:50.645550 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:50.645528 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0" Apr 16 16:47:50.646043 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.646026 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dc7qq"] Apr 16 16:47:50.646141 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.646122 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:50.646229 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:50.646207 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07" Apr 16 16:47:50.656764 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.656739 2576 generic.go:358] "Generic (PLEG): container finished" podID="20a28d9b-77a3-42fd-bd13-72cb783d8673" containerID="2a05e8e9f69e6e7f2311bb2414e541f61c81e28998b32fbec67cb41ec1c082f2" exitCode=0 Apr 16 16:47:50.656882 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.656815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerDied","Data":"2a05e8e9f69e6e7f2311bb2414e541f61c81e28998b32fbec67cb41ec1c082f2"} Apr 16 16:47:50.660447 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.660430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 16:47:50.660822 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.660802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" event={"ID":"a4267203-3686-4c79-a755-afbc3279763c","Type":"ContainerStarted","Data":"fb676a61b8ef36e18d876bb1899f27eb332fb8567e27aa09122624137a0fd3c4"}
Apr 16 16:47:50.660975 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.660809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w"
Apr 16 16:47:50.660975 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:50.660917 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af"
Apr 16 16:47:50.661077 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.661061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:50.679498 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.676484 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws"
Apr 16 16:47:50.705654 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:50.705617 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" podStartSLOduration=10.161338115 podStartE2EDuration="27.705607447s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.10087897 +0000 UTC m=+3.198758037" lastFinishedPulling="2026-04-16 16:47:43.645148298 +0000 UTC m=+20.743027369" observedRunningTime="2026-04-16 16:47:50.704046694 +0000 UTC m=+27.801925772" watchObservedRunningTime="2026-04-16 16:47:50.705607447 +0000 UTC m=+27.803486514"
Apr 16 16:47:51.664455 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:51.664406 2576 generic.go:358] "Generic (PLEG): container finished" podID="20a28d9b-77a3-42fd-bd13-72cb783d8673" containerID="48e9f8ddaddb48eb9234efedb8873de6a2b106b1da4d8030063e816db019b257" exitCode=0
Apr 16 16:47:51.664814 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:51.664458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerDied","Data":"48e9f8ddaddb48eb9234efedb8873de6a2b106b1da4d8030063e816db019b257"}
Apr 16 16:47:51.939348 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:51.939290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w"
Apr 16 16:47:51.939580 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:51.939390 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:47:51.939580 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:51.939436 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret podName:185db826-8d91-4046-9d72-6213e3ded5af nodeName:}" failed. No retries permitted until 2026-04-16 16:47:59.939423757 +0000 UTC m=+37.037302811 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret") pod "global-pull-secret-syncer-cwn6w" (UID: "185db826-8d91-4046-9d72-6213e3ded5af") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:47:52.504572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:52.504536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:47:52.504572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:52.504581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w"
Apr 16 16:47:52.504797 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:52.504582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6"
Apr 16 16:47:52.504797 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:52.504684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07"
Apr 16 16:47:52.504797 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:52.504737 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af"
Apr 16 16:47:52.504924 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:52.504813 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0"
Apr 16 16:47:54.505146 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:54.505109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w"
Apr 16 16:47:54.505713 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:54.505209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:47:54.505713 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:54.505222 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af"
Apr 16 16:47:54.505713 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:54.505322 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07"
Apr 16 16:47:54.505713 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:54.505373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6"
Apr 16 16:47:54.505713 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:54.505434 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0"
Apr 16 16:47:56.505510 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:56.505481 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6"
Apr 16 16:47:56.505510 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:56.505498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:47:56.505971 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:56.505571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w"
Apr 16 16:47:56.505971 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:56.505589 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2qz6" podUID="3e57de2a-0cfa-4859-bb21-132d521252f0"
Apr 16 16:47:56.505971 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:56.505661 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwn6w" podUID="185db826-8d91-4046-9d72-6213e3ded5af"
Apr 16 16:47:56.505971 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:56.505738 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07"
Apr 16 16:47:57.174333 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.174301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:47:57.174479 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.174406 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:57.174479 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.174452 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.174439193 +0000 UTC m=+66.272318247 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:57.274723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.274578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6"
Apr 16 16:47:57.274723 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.274720 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:47:57.274864 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.274738 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:47:57.274864 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.274748 2576 projected.go:194] Error preparing data for projected volume kube-api-access-6chrj for pod openshift-network-diagnostics/network-check-target-b2qz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:57.274864 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.274791 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj podName:3e57de2a-0cfa-4859-bb21-132d521252f0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.27477865 +0000 UTC m=+66.372657707 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6chrj" (UniqueName: "kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj") pod "network-check-target-b2qz6" (UID: "3e57de2a-0cfa-4859-bb21-132d521252f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:57.678425 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.678404 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-1.ec2.internal" event="NodeReady"
Apr 16 16:47:57.678754 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.678553 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 16:47:57.678999 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.678971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerStarted","Data":"0eac3c1e0c048362bdc4fe5f2294c9720fc14c394399bd6eee153ecdf265d1a7"}
Apr 16 16:47:57.716828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.716774 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-558d79887-xgmcw"]
Apr 16 16:47:57.730108 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.730086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.733152 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.733135 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 16:47:57.733660 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.733644 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-zkp2p\""
Apr 16 16:47:57.733845 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.733828 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 16:47:57.736328 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.736310 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 16:47:57.751240 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.751223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-558d79887-xgmcw"]
Apr 16 16:47:57.759832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.759813 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"]
Apr 16 16:47:57.763584 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.763357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 16:47:57.784129 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.784110 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rjlc7"]
Apr 16 16:47:57.784267 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.784252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:47:57.786629 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.786614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 16:47:57.786712 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.786699 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-kh49p\""
Apr 16 16:47:57.786779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.786699 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 16:47:57.809179 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.809155 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2j894"]
Apr 16 16:47:57.809293 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.809279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:47:57.812152 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.812133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:47:57.812244 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.812199 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:47:57.812244 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.812224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:47:57.812328 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.812229 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s9vpx\""
Apr 16 16:47:57.827547 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.827531 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"]
Apr 16 16:47:57.827623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.827551 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rjlc7"]
Apr 16 16:47:57.827623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.827559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2j894"]
Apr 16 16:47:57.827688 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.827636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:57.830017 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.829998 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:47:57.830100 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.830035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:47:57.830180 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.830162 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-z7whv\""
Apr 16 16:47:57.878913 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.878886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a1cc086-1280-418f-b306-f49c2436ad0c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:47:57.879016 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.878919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:47:57.879016 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.878936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-trusted-ca\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879016 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.878997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879127 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.879022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-registry-certificates\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879127 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.879083 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-installation-pull-secrets\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879127 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.879122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2q7\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-kube-api-access-dd2q7\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879235 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.879145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-image-registry-private-configuration\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879235 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.879176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-bound-sa-token\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.879235 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.879201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f2cf009-1779-4552-ade1-87d49639f382-ca-trust-extracted\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980253 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a1cc086-1280-418f-b306-f49c2436ad0c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:47:57.980253 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-trusted-ca\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-registry-certificates\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.980345 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-installation-pull-secrets\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980383 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1997f718-c32f-43e2-8412-29f59bf82303-tmp-dir\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.980401 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:47:58.480387431 +0000 UTC m=+35.578266485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.980404 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.980429 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1997f718-c32f-43e2-8412-29f59bf82303-config-volume\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:57.980490 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:58.480472143 +0000 UTC m=+35.578351206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qctd\" (UniqueName: \"kubernetes.io/projected/1997f718-c32f-43e2-8412-29f59bf82303-kube-api-access-9qctd\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd2q7\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-kube-api-access-dd2q7\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxkqr\" (UniqueName: \"kubernetes.io/projected/2c428e0a-19c4-4e80-ba25-9b1be39d973e-kube-api-access-wxkqr\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-image-registry-private-configuration\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-bound-sa-token\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.980779 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f2cf009-1779-4552-ade1-87d49639f382-ca-trust-extracted\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.981258 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-registry-certificates\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.981258 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.980964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a1cc086-1280-418f-b306-f49c2436ad0c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:47:57.981258 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.981060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f2cf009-1779-4552-ade1-87d49639f382-ca-trust-extracted\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.984508 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.984492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-installation-pull-secrets\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.984508 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.984501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-image-registry-private-configuration\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.988828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.988811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2q7\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-kube-api-access-dd2q7\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.989084 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.989063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-bound-sa-token\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:57.994134 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:57.994116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-trusted-ca\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:47:58.081642 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1997f718-c32f-43e2-8412-29f59bf82303-tmp-dir\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:58.081703 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1997f718-c32f-43e2-8412-29f59bf82303-config-volume\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:58.081703 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qctd\" (UniqueName: \"kubernetes.io/projected/1997f718-c32f-43e2-8412-29f59bf82303-kube-api-access-9qctd\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:58.081785 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxkqr\" (UniqueName: \"kubernetes.io/projected/2c428e0a-19c4-4e80-ba25-9b1be39d973e-kube-api-access-wxkqr\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:47:58.081886 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:58.081973 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1997f718-c32f-43e2-8412-29f59bf82303-tmp-dir\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:47:58.081973 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.081899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:47:58.082073 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.081975 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:47:58.082073 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.082007 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:47:58.082073 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.082020 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:58.582004831 +0000 UTC m=+35.679883890 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:47:58.082073 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.082051 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:47:58.582036963 +0000 UTC m=+35.679916030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:47:58.082216 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.082147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1997f718-c32f-43e2-8412-29f59bf82303-config-volume\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:47:58.094295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.094273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxkqr\" (UniqueName: \"kubernetes.io/projected/2c428e0a-19c4-4e80-ba25-9b1be39d973e-kube-api-access-wxkqr\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:47:58.094793 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.094778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qctd\" (UniqueName: \"kubernetes.io/projected/1997f718-c32f-43e2-8412-29f59bf82303-kube-api-access-9qctd\") pod \"dns-default-2j894\" (UID: 
\"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:47:58.485460 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.485438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:47:58.485550 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.485466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:47:58.485598 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.485551 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:47:58.485598 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.485556 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:47:58.485598 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.485566 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:47:58.485598 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.485589 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. 
No retries permitted until 2026-04-16 16:47:59.48557879 +0000 UTC m=+36.583457844 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:47:58.485736 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.485602 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:59.485595516 +0000 UTC m=+36.583474569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:47:58.505346 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.505326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:58.505400 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.505358 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:47:58.505531 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.505519 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:47:58.508006 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.507990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:47:58.508089 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.508081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:47:58.509078 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.509059 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-km8f8\"" Apr 16 16:47:58.509151 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.509072 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:47:58.509151 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.509081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fz45n\"" Apr 16 16:47:58.509262 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.509069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:47:58.586258 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.586236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:47:58.586328 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.586261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod 
\"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:47:58.586388 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.586329 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:47:58.586388 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.586353 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:47:58.586388 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.586364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:47:59.586354371 +0000 UTC m=+36.684233424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:47:58.586490 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:58.586398 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:59.586388188 +0000 UTC m=+36.684267246 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:47:58.682783 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.682761 2576 generic.go:358] "Generic (PLEG): container finished" podID="20a28d9b-77a3-42fd-bd13-72cb783d8673" containerID="0eac3c1e0c048362bdc4fe5f2294c9720fc14c394399bd6eee153ecdf265d1a7" exitCode=0 Apr 16 16:47:58.683206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:58.682796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerDied","Data":"0eac3c1e0c048362bdc4fe5f2294c9720fc14c394399bd6eee153ecdf265d1a7"} Apr 16 16:47:59.492078 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.492042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:47:59.492242 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.492091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:47:59.492242 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.492181 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not 
found Apr 16 16:47:59.492242 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.492225 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:47:59.492242 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.492238 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:47:59.492403 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.492255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:48:01.492240493 +0000 UTC m=+38.590119546 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:47:59.492403 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.492285 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:01.492269058 +0000 UTC m=+38.590148132 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:47:59.592766 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.592739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:47:59.592900 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.592772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:47:59.592900 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.592858 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:47:59.592900 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.592902 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:01.592891592 +0000 UTC m=+38.690770645 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:47:59.593070 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.592910 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:47:59.593070 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:47:59.592986 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:01.592970342 +0000 UTC m=+38.690849407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:47:59.688883 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.688854 2576 generic.go:358] "Generic (PLEG): container finished" podID="20a28d9b-77a3-42fd-bd13-72cb783d8673" containerID="64e65212031685f22d2d0b213e4f9fb70a6607c8bd933aa2fda08a2b9ee41c48" exitCode=0 Apr 16 16:47:59.689310 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.688892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerDied","Data":"64e65212031685f22d2d0b213e4f9fb70a6607c8bd933aa2fda08a2b9ee41c48"} Apr 16 16:47:59.995170 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.995143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod 
\"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:47:59.997327 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:47:59.997301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/185db826-8d91-4046-9d72-6213e3ded5af-original-pull-secret\") pod \"global-pull-secret-syncer-cwn6w\" (UID: \"185db826-8d91-4046-9d72-6213e3ded5af\") " pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:48:00.024646 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:00.024602 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwn6w" Apr 16 16:48:00.162266 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:00.162096 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cwn6w"] Apr 16 16:48:00.165620 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:48:00.165598 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod185db826_8d91_4046_9d72_6213e3ded5af.slice/crio-cc60c3d2a64866a27adacc3a1db577bd0b277fcc39dc786448e029e598b4cee2 WatchSource:0}: Error finding container cc60c3d2a64866a27adacc3a1db577bd0b277fcc39dc786448e029e598b4cee2: Status 404 returned error can't find the container with id cc60c3d2a64866a27adacc3a1db577bd0b277fcc39dc786448e029e598b4cee2 Apr 16 16:48:00.692420 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:00.692384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cwn6w" event={"ID":"185db826-8d91-4046-9d72-6213e3ded5af","Type":"ContainerStarted","Data":"cc60c3d2a64866a27adacc3a1db577bd0b277fcc39dc786448e029e598b4cee2"} Apr 16 16:48:00.695956 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:00.695918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-4l5qj" event={"ID":"20a28d9b-77a3-42fd-bd13-72cb783d8673","Type":"ContainerStarted","Data":"e3e02c4ae01ea41400bf2ac0abf9ec6acaed8069a9e7d9e3417b28ae8ab0ff87"} Apr 16 16:48:00.719408 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:00.719362 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4l5qj" podStartSLOduration=6.455875809 podStartE2EDuration="37.71934622s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:47:26.113902075 +0000 UTC m=+3.211781144" lastFinishedPulling="2026-04-16 16:47:57.377372489 +0000 UTC m=+34.475251555" observedRunningTime="2026-04-16 16:48:00.717689514 +0000 UTC m=+37.815568600" watchObservedRunningTime="2026-04-16 16:48:00.71934622 +0000 UTC m=+37.817225296" Apr 16 16:48:01.505500 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:01.505467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:48:01.505500 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:01.505505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:48:01.505742 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.505605 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 
16:48:01.505742 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.505646 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:01.505742 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.505656 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:48:01.505742 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.505672 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.505657474 +0000 UTC m=+42.603536527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:48:01.505742 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.505693 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.505681713 +0000 UTC m=+42.603560766 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:48:01.606798 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:01.606769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:48:01.606970 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:01.606804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:48:01.606970 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.606906 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:01.606970 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.606957 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:01.607114 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.606977 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.606960081 +0000 UTC m=+42.704839135 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:48:01.607114 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:01.607003 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.606989292 +0000 UTC m=+42.704868367 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:48:05.538380 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:05.538346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:48:05.538380 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:05.538383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:48:05.538844 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.538476 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" 
not found Apr 16 16:48:05.538844 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.538485 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:05.538844 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.538494 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:48:05.538844 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.538532 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:48:13.538518821 +0000 UTC m=+50.636397874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:48:05.538844 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.538546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:13.538539417 +0000 UTC m=+50.636418470 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:48:05.639060 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:05.639037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:48:05.639166 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:05.639130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:48:05.639166 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.639147 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:05.639231 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.639185 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:13.639174073 +0000 UTC m=+50.737053127 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:48:05.639231 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.639196 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:05.639295 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:05.639238 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:13.639227901 +0000 UTC m=+50.737106955 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:48:05.706291 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:05.706270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cwn6w" event={"ID":"185db826-8d91-4046-9d72-6213e3ded5af","Type":"ContainerStarted","Data":"bfb89f2f1d8d03ae9c72e28ca35ffaa6b94a3e78d387ca78ad92fd6a2003aee1"} Apr 16 16:48:05.720385 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:05.720348 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cwn6w" podStartSLOduration=17.226306093 podStartE2EDuration="21.720336647s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:48:00.167451864 +0000 UTC m=+37.265330934" lastFinishedPulling="2026-04-16 16:48:04.661482432 +0000 UTC m=+41.759361488" observedRunningTime="2026-04-16 16:48:05.720113281 +0000 UTC m=+42.817992356" 
watchObservedRunningTime="2026-04-16 16:48:05.720336647 +0000 UTC m=+42.818215721" Apr 16 16:48:13.591546 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:13.591518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:48:13.591546 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:13.591549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:48:13.591934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.591625 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:13.591934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.591626 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:13.591934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.591634 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:48:13.591934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.591680 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:48:29.59166822 +0000 UTC m=+66.689547273 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:48:13.591934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.591692 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.591686833 +0000 UTC m=+66.689565886 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:48:13.692392 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:13.692340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:48:13.692392 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:13.692366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:48:13.692530 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.692466 2576 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:13.692530 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.692468 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:13.692530 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.692510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.692494304 +0000 UTC m=+66.790373367 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:48:13.692530 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:13.692523 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:29.692517291 +0000 UTC m=+66.790396348 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:48:22.677497 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:22.677471 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6dbws" Apr 16 16:48:29.197113 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.197084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:48:29.199839 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.199821 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:48:29.207905 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.207886 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:48:29.207976 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.207936 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:33.207922817 +0000 UTC m=+130.305801870 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : secret "metrics-daemon-secret" not found Apr 16 16:48:29.298455 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.298263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:48:29.301421 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.301401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:48:29.312083 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.312058 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:48:29.323672 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.323653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chrj\" (UniqueName: \"kubernetes.io/projected/3e57de2a-0cfa-4859-bb21-132d521252f0-kube-api-access-6chrj\") pod \"network-check-target-b2qz6\" (UID: \"3e57de2a-0cfa-4859-bb21-132d521252f0\") " pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:48:29.417093 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.417072 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-km8f8\"" Apr 16 16:48:29.425234 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.425218 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:48:29.531254 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.531221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b2qz6"] Apr 16 16:48:29.535596 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:48:29.535560 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e57de2a_0cfa_4859_bb21_132d521252f0.slice/crio-e281bf546729e89eb88cdcc1e47d2c4a2c501aad56fd52d807897685d7ed39f2 WatchSource:0}: Error finding container e281bf546729e89eb88cdcc1e47d2c4a2c501aad56fd52d807897685d7ed39f2: Status 404 returned error can't find the container with id e281bf546729e89eb88cdcc1e47d2c4a2c501aad56fd52d807897685d7ed39f2 Apr 16 16:48:29.600678 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.600650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:48:29.600774 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.600679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:48:29.600832 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.600794 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:29.600832 ip-10-0-130-1 
kubenswrapper[2576]: E0416 16:48:29.600802 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:48:29.600832 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.600823 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:48:29.600981 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.600846 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:01.600832583 +0000 UTC m=+98.698711636 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:48:29.600981 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.600901 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:49:01.600881719 +0000 UTC m=+98.698760773 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:48:29.701725 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.701700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:48:29.701823 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.701727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:48:29.701883 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.701828 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:29.701883 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.701846 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:29.701883 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.701880 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:49:01.701868475 +0000 UTC m=+98.799747529 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:48:29.702037 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:48:29.701901 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:01.701884896 +0000 UTC m=+98.799763964 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:48:29.749511 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:29.749486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b2qz6" event={"ID":"3e57de2a-0cfa-4859-bb21-132d521252f0","Type":"ContainerStarted","Data":"e281bf546729e89eb88cdcc1e47d2c4a2c501aad56fd52d807897685d7ed39f2"} Apr 16 16:48:32.756679 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:32.756656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b2qz6" event={"ID":"3e57de2a-0cfa-4859-bb21-132d521252f0","Type":"ContainerStarted","Data":"e27eee5a473d8bfb1a0915f5017c2381d495078149a26b84f98f5e0fb2c00083"} Apr 16 16:48:32.757041 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:32.756794 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:48:32.771863 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:48:32.771828 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-b2qz6" podStartSLOduration=66.759973223 podStartE2EDuration="1m9.77181537s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:48:29.537792304 +0000 UTC m=+66.635671358" lastFinishedPulling="2026-04-16 16:48:32.549634439 +0000 UTC m=+69.647513505" observedRunningTime="2026-04-16 16:48:32.771542193 +0000 UTC m=+69.869421267" watchObservedRunningTime="2026-04-16 16:48:32.77181537 +0000 UTC m=+69.869694444" Apr 16 16:49:01.623503 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:01.623465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:49:01.623503 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:01.623505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:49:01.623934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.623579 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:49:01.623934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.623589 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found Apr 16 16:49:01.623934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.623590 2576 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:49:01.623934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.623638 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.623626203 +0000 UTC m=+162.721505256 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found Apr 16 16:49:01.623934 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.623661 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.623644311 +0000 UTC m=+162.721523366 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found Apr 16 16:49:01.724575 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:01.724545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:49:01.724660 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:01.724581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:49:01.724697 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.724670 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:49:01.724697 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.724670 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:49:01.724764 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.724720 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.724705871 +0000 UTC m=+162.822584924 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found Apr 16 16:49:01.724764 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:01.724736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.724727989 +0000 UTC m=+162.822607052 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found Apr 16 16:49:03.759871 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:03.759846 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b2qz6" Apr 16 16:49:33.238397 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:33.238358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:49:33.238824 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:33.238464 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:49:33.238824 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:49:33.238520 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs podName:0f7cce27-fc9e-437d-9147-a82b82151b07 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:51:35.238506417 +0000 UTC m=+252.336385481 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs") pod "network-metrics-daemon-dc7qq" (UID: "0f7cce27-fc9e-437d-9147-a82b82151b07") : secret "metrics-daemon-secret" not found Apr 16 16:49:58.070787 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.070756 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"] Apr 16 16:49:58.073504 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.073488 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" Apr 16 16:49:58.076145 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.076121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:49:58.076145 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.076122 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:49:58.076295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.076157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:49:58.077255 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.077240 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:49:58.077331 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.077244 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hfd2n\"" Apr 16 16:49:58.082800 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.082775 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"] Apr 16 16:49:58.206606 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.206574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" Apr 16 16:49:58.206764 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.206667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" Apr 16 16:49:58.206764 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.206685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv8ft\" (UniqueName: \"kubernetes.io/projected/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-kube-api-access-wv8ft\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" Apr 16 16:49:58.307723 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:49:58.307695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.307831 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.307731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wv8ft\" (UniqueName: \"kubernetes.io/projected/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-kube-api-access-wv8ft\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.307831 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.307767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.308179 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.308161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.309987 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.309968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.316008 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.315989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv8ft\" (UniqueName: \"kubernetes.io/projected/c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe-kube-api-access-wv8ft\") pod \"kube-storage-version-migrator-operator-756bb7d76f-tqhcx\" (UID: \"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.381432 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.381387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"
Apr 16 16:49:58.499493 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.499465 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx"]
Apr 16 16:49:58.502510 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:49:58.502477 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ba38c2_f795_4fd6_9d38_17e6a32e0bbe.slice/crio-9f9785d55c4a36cc2bcf2ffc05e0ff2c8ca0025e1a91e4d74ba40cb9739e321b WatchSource:0}: Error finding container 9f9785d55c4a36cc2bcf2ffc05e0ff2c8ca0025e1a91e4d74ba40cb9739e321b: Status 404 returned error can't find the container with id 9f9785d55c4a36cc2bcf2ffc05e0ff2c8ca0025e1a91e4d74ba40cb9739e321b
Apr 16 16:49:58.894110 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:49:58.894081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" event={"ID":"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe","Type":"ContainerStarted","Data":"9f9785d55c4a36cc2bcf2ffc05e0ff2c8ca0025e1a91e4d74ba40cb9739e321b"}
Apr 16 16:50:00.738447 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:00.738419 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-558d79887-xgmcw" podUID="2f2cf009-1779-4552-ade1-87d49639f382"
Apr 16 16:50:00.791924 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:00.791897 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" podUID="7a1cc086-1280-418f-b306-f49c2436ad0c"
Apr 16 16:50:00.817078 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:00.817051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rjlc7" podUID="2c428e0a-19c4-4e80-ba25-9b1be39d973e"
Apr 16 16:50:00.834394 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:00.834375 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2j894" podUID="1997f718-c32f-43e2-8412-29f59bf82303"
Apr 16 16:50:00.899257 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:00.899235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:50:00.899358 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:00.899269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" event={"ID":"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe","Type":"ContainerStarted","Data":"d80e40753c8b69ab90933d9ff2f3c854845dad1e644e7e3728de648a53071fc1"}
Apr 16 16:50:00.899358 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:00.899290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:50:00.899358 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:00.899290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:50:00.919472 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:00.916562 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" podStartSLOduration=1.490117668 podStartE2EDuration="2.916548284s" podCreationTimestamp="2026-04-16 16:49:58 +0000 UTC" firstStartedPulling="2026-04-16 16:49:58.50407165 +0000 UTC m=+155.601950706" lastFinishedPulling="2026-04-16 16:49:59.930502259 +0000 UTC m=+157.028381322" observedRunningTime="2026-04-16 16:50:00.91479967 +0000 UTC m=+158.012678745" watchObservedRunningTime="2026-04-16 16:50:00.916548284 +0000 UTC m=+158.014427359"
Apr 16 16:50:01.515337 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.515301 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"]
Apr 16 16:50:01.518136 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.518115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"
Apr 16 16:50:01.520353 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:01.520324 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dc7qq" podUID="0f7cce27-fc9e-437d-9147-a82b82151b07"
Apr 16 16:50:01.520763 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.520740 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6fv6z\""
Apr 16 16:50:01.521925 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.521906 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 16:50:01.522042 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.521906 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 16:50:01.525884 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.525863 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"]
Apr 16 16:50:01.631847 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.631818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rklrt\" (UniqueName: \"kubernetes.io/projected/b3927737-e108-4def-a580-b80f2c3b48b6-kube-api-access-rklrt\") pod \"migrator-64d4d94569-qlwwb\" (UID: \"b3927737-e108-4def-a580-b80f2c3b48b6\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"
Apr 16 16:50:01.733040 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.732995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rklrt\" (UniqueName: \"kubernetes.io/projected/b3927737-e108-4def-a580-b80f2c3b48b6-kube-api-access-rklrt\") pod \"migrator-64d4d94569-qlwwb\" (UID: \"b3927737-e108-4def-a580-b80f2c3b48b6\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"
Apr 16 16:50:01.742049 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.742027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rklrt\" (UniqueName: \"kubernetes.io/projected/b3927737-e108-4def-a580-b80f2c3b48b6-kube-api-access-rklrt\") pod \"migrator-64d4d94569-qlwwb\" (UID: \"b3927737-e108-4def-a580-b80f2c3b48b6\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"
Apr 16 16:50:01.827762 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.827709 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"
Apr 16 16:50:01.943396 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:01.943370 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb"]
Apr 16 16:50:01.946214 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:01.946185 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3927737_e108_4def_a580_b80f2c3b48b6.slice/crio-0a44aba94656c40a2efb77e26d11d08961bee5c9047ff98e0253d83fc5f383ee WatchSource:0}: Error finding container 0a44aba94656c40a2efb77e26d11d08961bee5c9047ff98e0253d83fc5f383ee: Status 404 returned error can't find the container with id 0a44aba94656c40a2efb77e26d11d08961bee5c9047ff98e0253d83fc5f383ee
Apr 16 16:50:02.905002 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:02.904912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb" event={"ID":"b3927737-e108-4def-a580-b80f2c3b48b6","Type":"ContainerStarted","Data":"0a44aba94656c40a2efb77e26d11d08961bee5c9047ff98e0253d83fc5f383ee"}
Apr 16 16:50:03.908450 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:03.908422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb" event={"ID":"b3927737-e108-4def-a580-b80f2c3b48b6","Type":"ContainerStarted","Data":"fa07db73cf10fa12389dae67cb4c3cc1da60fcdfe921aa3dfad052fcd68b83fa"}
Apr 16 16:50:03.908450 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:03.908451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb" event={"ID":"b3927737-e108-4def-a580-b80f2c3b48b6","Type":"ContainerStarted","Data":"63617cbebd81fd4ace367d48c238297e8d7a63771032f174fb5a38d47930b164"}
Apr 16 16:50:03.927140 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:03.927089 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qlwwb" podStartSLOduration=1.830404348 podStartE2EDuration="2.927078083s" podCreationTimestamp="2026-04-16 16:50:01 +0000 UTC" firstStartedPulling="2026-04-16 16:50:01.948067004 +0000 UTC m=+159.045946075" lastFinishedPulling="2026-04-16 16:50:03.044740757 +0000 UTC m=+160.142619810" observedRunningTime="2026-04-16 16:50:03.92641034 +0000 UTC m=+161.024289426" watchObservedRunningTime="2026-04-16 16:50:03.927078083 +0000 UTC m=+161.024957157"
Apr 16 16:50:04.764309 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:04.764276 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4l9b9_51e9a221-2ee4-44c8-bb3a-29addd9e2fe5/dns-node-resolver/0.log"
Apr 16 16:50:05.657468 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:05.657440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") pod \"image-registry-558d79887-xgmcw\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " pod="openshift-image-registry/image-registry-558d79887-xgmcw"
Apr 16 16:50:05.657872 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:05.657526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"
Apr 16 16:50:05.657872 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.657595 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:50:05.657872 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.657619 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-558d79887-xgmcw: secret "image-registry-tls" not found
Apr 16 16:50:05.657872 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.657598 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:50:05.657872 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.657688 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls podName:2f2cf009-1779-4552-ade1-87d49639f382 nodeName:}" failed. No retries permitted until 2026-04-16 16:52:07.657668479 +0000 UTC m=+284.755547533 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls") pod "image-registry-558d79887-xgmcw" (UID: "2f2cf009-1779-4552-ade1-87d49639f382") : secret "image-registry-tls" not found
Apr 16 16:50:05.657872 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.657729 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert podName:7a1cc086-1280-418f-b306-f49c2436ad0c nodeName:}" failed. No retries permitted until 2026-04-16 16:52:07.657719305 +0000 UTC m=+284.755598358 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-4tmzr" (UID: "7a1cc086-1280-418f-b306-f49c2436ad0c") : secret "networking-console-plugin-cert" not found
Apr 16 16:50:05.758484 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:05.758455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894"
Apr 16 16:50:05.758571 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:05.758489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7"
Apr 16 16:50:05.758613 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.758578 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:50:05.758613 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.758578 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:50:05.758723 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.758633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls podName:1997f718-c32f-43e2-8412-29f59bf82303 nodeName:}" failed. No retries permitted until 2026-04-16 16:52:07.758618042 +0000 UTC m=+284.856497101 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls") pod "dns-default-2j894" (UID: "1997f718-c32f-43e2-8412-29f59bf82303") : secret "dns-default-metrics-tls" not found
Apr 16 16:50:05.758723 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:05.758647 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert podName:2c428e0a-19c4-4e80-ba25-9b1be39d973e nodeName:}" failed. No retries permitted until 2026-04-16 16:52:07.758640533 +0000 UTC m=+284.856519590 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert") pod "ingress-canary-rjlc7" (UID: "2c428e0a-19c4-4e80-ba25-9b1be39d973e") : secret "canary-serving-cert" not found
Apr 16 16:50:06.165583 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:06.165556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-k2zl5_c32688da-6a07-4fb0-a11d-64239ab022f4/node-ca/0.log"
Apr 16 16:50:07.170273 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:07.170253 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qlwwb_b3927737-e108-4def-a580-b80f2c3b48b6/migrator/0.log"
Apr 16 16:50:07.364579 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:07.364557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qlwwb_b3927737-e108-4def-a580-b80f2c3b48b6/graceful-termination/0.log"
Apr 16 16:50:07.566513 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:07.566490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-tqhcx_c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe/kube-storage-version-migrator-operator/0.log"
Apr 16 16:50:11.504629 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:11.504599 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2j894"
Apr 16 16:50:15.504775 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:15.504743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq"
Apr 16 16:50:25.443784 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.443749 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sqgtm"]
Apr 16 16:50:25.446562 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.446546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.449834 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.449814 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-w2v8f\""
Apr 16 16:50:25.450980 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.450963 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 16:50:25.451102 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.451079 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 16:50:25.451186 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.451163 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 16:50:25.451305 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.451170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 16:50:25.458957 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.458919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sqgtm"]
Apr 16 16:50:25.492546 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.492526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/372066d1-c5d3-423a-90bc-f98568ca34a8-crio-socket\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.492647 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.492550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmjt\" (UniqueName: \"kubernetes.io/projected/372066d1-c5d3-423a-90bc-f98568ca34a8-kube-api-access-nsmjt\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.492647 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.492596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/372066d1-c5d3-423a-90bc-f98568ca34a8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.492761 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.492667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/372066d1-c5d3-423a-90bc-f98568ca34a8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.492761 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.492715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/372066d1-c5d3-423a-90bc-f98568ca34a8-data-volume\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.593889 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.593869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/372066d1-c5d3-423a-90bc-f98568ca34a8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594018 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.593898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/372066d1-c5d3-423a-90bc-f98568ca34a8-data-volume\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594018 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.593931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/372066d1-c5d3-423a-90bc-f98568ca34a8-crio-socket\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594018 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.593971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmjt\" (UniqueName: \"kubernetes.io/projected/372066d1-c5d3-423a-90bc-f98568ca34a8-kube-api-access-nsmjt\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594159 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.594034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/372066d1-c5d3-423a-90bc-f98568ca34a8-crio-socket\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594159 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.594044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/372066d1-c5d3-423a-90bc-f98568ca34a8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594286 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.594268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/372066d1-c5d3-423a-90bc-f98568ca34a8-data-volume\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.594485 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.594466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/372066d1-c5d3-423a-90bc-f98568ca34a8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.596311 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.596292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/372066d1-c5d3-423a-90bc-f98568ca34a8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.605264 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.605247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmjt\" (UniqueName: \"kubernetes.io/projected/372066d1-c5d3-423a-90bc-f98568ca34a8-kube-api-access-nsmjt\") pod \"insights-runtime-extractor-sqgtm\" (UID: \"372066d1-c5d3-423a-90bc-f98568ca34a8\") " pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.755615 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.755597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sqgtm"
Apr 16 16:50:25.874331 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.874289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sqgtm"]
Apr 16 16:50:25.879072 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:25.879044 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod372066d1_c5d3_423a_90bc_f98568ca34a8.slice/crio-45a5581dece336ea30e44e2bb88c0923a678aeb13fb6c43f03d91e6e3997985d WatchSource:0}: Error finding container 45a5581dece336ea30e44e2bb88c0923a678aeb13fb6c43f03d91e6e3997985d: Status 404 returned error can't find the container with id 45a5581dece336ea30e44e2bb88c0923a678aeb13fb6c43f03d91e6e3997985d
Apr 16 16:50:25.959470 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.959437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sqgtm" event={"ID":"372066d1-c5d3-423a-90bc-f98568ca34a8","Type":"ContainerStarted","Data":"33d8127fb7261e3e01d7360693133f986d4740152612e156e5904bcadda2d9e3"}
Apr 16 16:50:25.959592 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:25.959474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sqgtm" event={"ID":"372066d1-c5d3-423a-90bc-f98568ca34a8","Type":"ContainerStarted","Data":"45a5581dece336ea30e44e2bb88c0923a678aeb13fb6c43f03d91e6e3997985d"}
Apr 16 16:50:26.964585 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:26.964548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sqgtm" event={"ID":"372066d1-c5d3-423a-90bc-f98568ca34a8","Type":"ContainerStarted","Data":"e148122f530eb80dcc2e0b38030ee36deeb44548562a5d3ff786568c9aa60512"}
Apr 16 16:50:27.968247 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:27.968215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sqgtm" event={"ID":"372066d1-c5d3-423a-90bc-f98568ca34a8","Type":"ContainerStarted","Data":"15c5a6c813c650d6071efe91f2275f519b3be221a329fe753332e0d9071a599f"}
Apr 16 16:50:32.152911 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.152863 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sqgtm" podStartSLOduration=5.297884825 podStartE2EDuration="7.152848622s" podCreationTimestamp="2026-04-16 16:50:25 +0000 UTC" firstStartedPulling="2026-04-16 16:50:25.934406755 +0000 UTC m=+183.032285809" lastFinishedPulling="2026-04-16 16:50:27.789370548 +0000 UTC m=+184.887249606" observedRunningTime="2026-04-16 16:50:27.991590316 +0000 UTC m=+185.089469390" watchObservedRunningTime="2026-04-16 16:50:32.152848622 +0000 UTC m=+189.250727697"
Apr 16 16:50:32.153341 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.153127 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"]
Apr 16 16:50:32.155900 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.155885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:32.159881 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.159862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 16:50:32.159999 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.159904 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dfjnd\""
Apr 16 16:50:32.169237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.169218 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"]
Apr 16 16:50:32.244168 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.244146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/43b34e2c-035c-4c11-806c-40bf965f3ab2-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-dn2q7\" (UID: \"43b34e2c-035c-4c11-806c-40bf965f3ab2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:32.344973 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.344934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/43b34e2c-035c-4c11-806c-40bf965f3ab2-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-dn2q7\" (UID: \"43b34e2c-035c-4c11-806c-40bf965f3ab2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:32.347322 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.347304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/43b34e2c-035c-4c11-806c-40bf965f3ab2-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-dn2q7\" (UID: \"43b34e2c-035c-4c11-806c-40bf965f3ab2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:32.464305 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.464257 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:32.571696 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.571668 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"]
Apr 16 16:50:32.574902 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:32.574877 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b34e2c_035c_4c11_806c_40bf965f3ab2.slice/crio-dcf4c293e4a69a2eac3a689b53c67543c2e539e985d1de09e169697cf8cdfd3b WatchSource:0}: Error finding container dcf4c293e4a69a2eac3a689b53c67543c2e539e985d1de09e169697cf8cdfd3b: Status 404 returned error can't find the container with id dcf4c293e4a69a2eac3a689b53c67543c2e539e985d1de09e169697cf8cdfd3b
Apr 16 16:50:32.981167 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:32.981131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7" event={"ID":"43b34e2c-035c-4c11-806c-40bf965f3ab2","Type":"ContainerStarted","Data":"dcf4c293e4a69a2eac3a689b53c67543c2e539e985d1de09e169697cf8cdfd3b"}
Apr 16 16:50:33.984323 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:33.984288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7" event={"ID":"43b34e2c-035c-4c11-806c-40bf965f3ab2","Type":"ContainerStarted","Data":"4d56f56b85d9a1a00bb77852d21141fedeb7ac0b0f5afca1cc2b46fddf348844"}
Apr 16 16:50:33.984739 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:33.984494 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:33.988762 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:33.988733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7"
Apr 16 16:50:34.006555 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:34.006516 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-dn2q7" podStartSLOduration=1.159160377 podStartE2EDuration="2.006504811s" podCreationTimestamp="2026-04-16 16:50:32 +0000 UTC" firstStartedPulling="2026-04-16 16:50:32.577282176 +0000 UTC m=+189.675161229" lastFinishedPulling="2026-04-16 16:50:33.424626609 +0000 UTC m=+190.522505663" observedRunningTime="2026-04-16 16:50:34.006159778 +0000 UTC m=+191.104038852" watchObservedRunningTime="2026-04-16 16:50:34.006504811 +0000 UTC m=+191.104383908"
Apr 16 16:50:37.593403 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.593369 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x"]
Apr 16 16:50:37.596421 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.596406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x"
Apr 16 16:50:37.600787 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.600769 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 16:50:37.601192 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.601169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-4cf4v\""
Apr 16 16:50:37.601254 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.601171 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 16:50:37.605235 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.605220 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 16:50:37.605615 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.605602 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 16:50:37.606286 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.606274 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 16:50:37.611635 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.611615 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x"]
Apr 16 16:50:37.650318 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.650295 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4q5ss"]
Apr 16 16:50:37.653318 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.653304 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.658318 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.658299 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:50:37.658419 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.658323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:50:37.658419 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.658345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hp9nk\"" Apr 16 16:50:37.659177 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.659160 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:50:37.685415 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.685389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.685575 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.685559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-sys\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.685748 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.685732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.685862 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.685848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8drp\" (UniqueName: \"kubernetes.io/projected/3e5c992b-b27e-4f83-bb52-d788e02e6097-kube-api-access-l8drp\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.685993 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.685980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-root\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686109 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-textfile\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686282 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.686351 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686728 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-tls\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686793 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-wtmp\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686793 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkg9r\" (UniqueName: \"kubernetes.io/projected/686536bf-e737-40fa-8d47-a76a6dfd56c5-kube-api-access-vkg9r\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686904 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686799 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/686536bf-e737-40fa-8d47-a76a6dfd56c5-metrics-client-ca\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.686904 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.686828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e5c992b-b27e-4f83-bb52-d788e02e6097-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.787629 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.787723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8drp\" (UniqueName: \"kubernetes.io/projected/3e5c992b-b27e-4f83-bb52-d788e02e6097-kube-api-access-l8drp\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.787723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-root\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.787723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-textfile\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.787723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.787723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-tls\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:50:37.787757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-wtmp\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkg9r\" (UniqueName: \"kubernetes.io/projected/686536bf-e737-40fa-8d47-a76a6dfd56c5-kube-api-access-vkg9r\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:37.787791 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-root\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/686536bf-e737-40fa-8d47-a76a6dfd56c5-metrics-client-ca\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:37.787857 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-tls 
podName:3e5c992b-b27e-4f83-bb52-d788e02e6097 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:38.287839663 +0000 UTC m=+195.385718719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-bqs9x" (UID: "3e5c992b-b27e-4f83-bb52-d788e02e6097") : secret "openshift-state-metrics-tls" not found Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e5c992b-b27e-4f83-bb52-d788e02e6097-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-wtmp\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788000 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.788000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-textfile\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788417 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.787986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-sys\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788417 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:37.787882 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:50:37.788417 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.788020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/686536bf-e737-40fa-8d47-a76a6dfd56c5-sys\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788417 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:37.788087 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-tls podName:686536bf-e737-40fa-8d47-a76a6dfd56c5 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:38.28806164 +0000 UTC m=+195.385940703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-tls") pod "node-exporter-4q5ss" (UID: "686536bf-e737-40fa-8d47-a76a6dfd56c5") : secret "node-exporter-tls" not found Apr 16 16:50:37.788561 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.788503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788596 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.788579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/686536bf-e737-40fa-8d47-a76a6dfd56c5-metrics-client-ca\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.788652 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.788637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e5c992b-b27e-4f83-bb52-d788e02e6097-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.790124 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.790107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:37.790472 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.790455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.796471 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.796453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkg9r\" (UniqueName: \"kubernetes.io/projected/686536bf-e737-40fa-8d47-a76a6dfd56c5-kube-api-access-vkg9r\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:37.796807 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:37.796788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8drp\" (UniqueName: \"kubernetes.io/projected/3e5c992b-b27e-4f83-bb52-d788e02e6097-kube-api-access-l8drp\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:38.292133 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.292105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:38.292251 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.292138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-tls\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:38.294471 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.294443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e5c992b-b27e-4f83-bb52-d788e02e6097-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-bqs9x\" (UID: \"3e5c992b-b27e-4f83-bb52-d788e02e6097\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:38.294471 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.294458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686536bf-e737-40fa-8d47-a76a6dfd56c5-node-exporter-tls\") pod \"node-exporter-4q5ss\" (UID: \"686536bf-e737-40fa-8d47-a76a6dfd56c5\") " pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:38.505337 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.505315 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" Apr 16 16:50:38.561792 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.561768 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4q5ss" Apr 16 16:50:38.571106 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:38.571079 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686536bf_e737_40fa_8d47_a76a6dfd56c5.slice/crio-15fe1dd87ed686f63c5ffd31e636a46c756eaceb4e38f730ea8d6e0d27ab6bac WatchSource:0}: Error finding container 15fe1dd87ed686f63c5ffd31e636a46c756eaceb4e38f730ea8d6e0d27ab6bac: Status 404 returned error can't find the container with id 15fe1dd87ed686f63c5ffd31e636a46c756eaceb4e38f730ea8d6e0d27ab6bac Apr 16 16:50:38.630781 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.630748 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x"] Apr 16 16:50:38.633660 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:38.633638 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e5c992b_b27e_4f83_bb52_d788e02e6097.slice/crio-23de8c7af48769306d215e7460b83e362257087456710a5d26ce0d9af6050b0b WatchSource:0}: Error finding container 23de8c7af48769306d215e7460b83e362257087456710a5d26ce0d9af6050b0b: Status 404 returned error can't find the container with id 23de8c7af48769306d215e7460b83e362257087456710a5d26ce0d9af6050b0b Apr 16 16:50:38.999477 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:38.999425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4q5ss" event={"ID":"686536bf-e737-40fa-8d47-a76a6dfd56c5","Type":"ContainerStarted","Data":"15fe1dd87ed686f63c5ffd31e636a46c756eaceb4e38f730ea8d6e0d27ab6bac"} Apr 16 16:50:39.001303 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.001273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" 
event={"ID":"3e5c992b-b27e-4f83-bb52-d788e02e6097","Type":"ContainerStarted","Data":"ab76eb9b9300458a923fefad34d50bd36877867383055270057c469c75af11fa"} Apr 16 16:50:39.001303 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.001301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" event={"ID":"3e5c992b-b27e-4f83-bb52-d788e02e6097","Type":"ContainerStarted","Data":"0f0e2859768dda694d5acb82387c0e9f3be0cb400f4ca9d177a2022035ee118b"} Apr 16 16:50:39.001433 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.001317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" event={"ID":"3e5c992b-b27e-4f83-bb52-d788e02e6097","Type":"ContainerStarted","Data":"23de8c7af48769306d215e7460b83e362257087456710a5d26ce0d9af6050b0b"} Apr 16 16:50:39.549539 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.549463 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-58d7f4698b-x74k9"] Apr 16 16:50:39.553087 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.553066 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.555995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.555828 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 16:50:39.555995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.555850 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 16:50:39.555995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.555872 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-n4ftc\"" Apr 16 16:50:39.555995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.555891 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 16:50:39.555995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.555889 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 16:50:39.556275 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.556159 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-3j95ae4ekukj4\"" Apr 16 16:50:39.556275 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.556175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 16:50:39.564246 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.564223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-58d7f4698b-x74k9"] Apr 16 16:50:39.602016 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.601994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xgwp5\" (UniqueName: \"kubernetes.io/projected/c24cdb38-9b30-40da-a8f8-93f65dd132b9-kube-api-access-xgwp5\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602128 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602128 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602128 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602257 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-grpc-tls\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602257 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-tls\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602392 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c24cdb38-9b30-40da-a8f8-93f65dd132b9-metrics-client-ca\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.602449 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.602428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58d7f4698b-x74k9\" 
(UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwp5\" (UniqueName: \"kubernetes.io/projected/c24cdb38-9b30-40da-a8f8-93f65dd132b9-kube-api-access-xgwp5\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 
ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-grpc-tls\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-tls\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.703536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.703407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c24cdb38-9b30-40da-a8f8-93f65dd132b9-metrics-client-ca\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.704295 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.704270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c24cdb38-9b30-40da-a8f8-93f65dd132b9-metrics-client-ca\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.705730 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.705702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy\") 
pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.705879 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.705861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.706187 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.706149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.706292 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.706203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-tls\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.706292 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.706213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 
16:50:39.706605 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.706415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c24cdb38-9b30-40da-a8f8-93f65dd132b9-secret-grpc-tls\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.711284 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.711266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwp5\" (UniqueName: \"kubernetes.io/projected/c24cdb38-9b30-40da-a8f8-93f65dd132b9-kube-api-access-xgwp5\") pod \"thanos-querier-58d7f4698b-x74k9\" (UID: \"c24cdb38-9b30-40da-a8f8-93f65dd132b9\") " pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.863360 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.863306 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:39.985824 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:39.985796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-58d7f4698b-x74k9"] Apr 16 16:50:39.988834 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:39.988804 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24cdb38_9b30_40da_a8f8_93f65dd132b9.slice/crio-1968b4aacbeca11bcab1cb2cabd9af70f7915f20ffceb48de3801e2a02e55d90 WatchSource:0}: Error finding container 1968b4aacbeca11bcab1cb2cabd9af70f7915f20ffceb48de3801e2a02e55d90: Status 404 returned error can't find the container with id 1968b4aacbeca11bcab1cb2cabd9af70f7915f20ffceb48de3801e2a02e55d90 Apr 16 16:50:40.004858 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:40.004828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" 
event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"1968b4aacbeca11bcab1cb2cabd9af70f7915f20ffceb48de3801e2a02e55d90"} Apr 16 16:50:40.006087 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:40.006068 2576 generic.go:358] "Generic (PLEG): container finished" podID="686536bf-e737-40fa-8d47-a76a6dfd56c5" containerID="491a884f2fce952fe065f24476b7ed3ac1e2752f2020b6b1a76cca1c2a7723e6" exitCode=0 Apr 16 16:50:40.006176 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:40.006155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4q5ss" event={"ID":"686536bf-e737-40fa-8d47-a76a6dfd56c5","Type":"ContainerDied","Data":"491a884f2fce952fe065f24476b7ed3ac1e2752f2020b6b1a76cca1c2a7723e6"} Apr 16 16:50:40.007872 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:40.007854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" event={"ID":"3e5c992b-b27e-4f83-bb52-d788e02e6097","Type":"ContainerStarted","Data":"059b1d19d96cd172e40d6f558e9b57b1483459128c3279164e4ac5b51d6e3047"} Apr 16 16:50:40.055244 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:40.055199 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-bqs9x" podStartSLOduration=2.117621304 podStartE2EDuration="3.055183321s" podCreationTimestamp="2026-04-16 16:50:37 +0000 UTC" firstStartedPulling="2026-04-16 16:50:38.74167803 +0000 UTC m=+195.839557101" lastFinishedPulling="2026-04-16 16:50:39.679240063 +0000 UTC m=+196.777119118" observedRunningTime="2026-04-16 16:50:40.050277349 +0000 UTC m=+197.148156462" watchObservedRunningTime="2026-04-16 16:50:40.055183321 +0000 UTC m=+197.153062394" Apr 16 16:50:41.012540 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:41.012495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4q5ss" 
event={"ID":"686536bf-e737-40fa-8d47-a76a6dfd56c5","Type":"ContainerStarted","Data":"6f8e888e1b6f2254d62da7df00d913bcc7755e039e52b16492527652c999e2df"} Apr 16 16:50:41.012540 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:41.012544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4q5ss" event={"ID":"686536bf-e737-40fa-8d47-a76a6dfd56c5","Type":"ContainerStarted","Data":"19312524d3065af91ce72ae9f7cfc92147c4b38a05f01730644d58d16d9d6f30"} Apr 16 16:50:41.034220 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:41.034180 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4q5ss" podStartSLOduration=3.3316565430000002 podStartE2EDuration="4.034166445s" podCreationTimestamp="2026-04-16 16:50:37 +0000 UTC" firstStartedPulling="2026-04-16 16:50:38.573040362 +0000 UTC m=+195.670919415" lastFinishedPulling="2026-04-16 16:50:39.275550256 +0000 UTC m=+196.373429317" observedRunningTime="2026-04-16 16:50:41.033047777 +0000 UTC m=+198.130926852" watchObservedRunningTime="2026-04-16 16:50:41.034166445 +0000 UTC m=+198.132045502" Apr 16 16:50:42.000398 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.000368 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-64f884946d-gq5fq"] Apr 16 16:50:42.003382 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.003369 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.006053 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.006032 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 16:50:42.007335 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.007306 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:50:42.007335 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.007324 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 16:50:42.007486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.007309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bu7u8vccf6bkq\"" Apr 16 16:50:42.007486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.007309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 16:50:42.007486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.007414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-th6s7\"" Apr 16 16:50:42.016049 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.016022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"31c30397f3d75081b81b00b77e4ea1610ef9b0e5c062c197005954cac4e0ee9a"} Apr 16 16:50:42.016049 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.016050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" 
event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"43f00de1f7e09877b184c40bdeec8755573ca9ef096f08d2aa343c96993ae42c"} Apr 16 16:50:42.016430 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.016060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"cb214f8c56d5f2684cec80dd8f4cb59c688707132e092f40184c39b5cda5d6ab"} Apr 16 16:50:42.021804 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.021787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64f884946d-gq5fq"] Apr 16 16:50:42.122133 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-secret-metrics-server-client-certs\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.122213 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-secret-metrics-server-tls\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.122213 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a07c2b8a-d52e-4075-8340-3571cb362506-metrics-server-audit-profiles\") pod \"metrics-server-64f884946d-gq5fq\" 
(UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.122213 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48r8z\" (UniqueName: \"kubernetes.io/projected/a07c2b8a-d52e-4075-8340-3571cb362506-kube-api-access-48r8z\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.122324 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-client-ca-bundle\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.122324 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a07c2b8a-d52e-4075-8340-3571cb362506-audit-log\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.122430 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.122412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a07c2b8a-d52e-4075-8340-3571cb362506-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223613 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:50:42.223587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a07c2b8a-d52e-4075-8340-3571cb362506-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223729 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.223682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-secret-metrics-server-client-certs\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223729 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.223702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-secret-metrics-server-tls\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223729 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.223721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a07c2b8a-d52e-4075-8340-3571cb362506-metrics-server-audit-profiles\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.223743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48r8z\" (UniqueName: 
\"kubernetes.io/projected/a07c2b8a-d52e-4075-8340-3571cb362506-kube-api-access-48r8z\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.223785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-client-ca-bundle\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.223875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.223825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a07c2b8a-d52e-4075-8340-3571cb362506-audit-log\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.224432 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.224390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a07c2b8a-d52e-4075-8340-3571cb362506-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.224534 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.224416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a07c2b8a-d52e-4075-8340-3571cb362506-audit-log\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.225207 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:50:42.225185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a07c2b8a-d52e-4075-8340-3571cb362506-metrics-server-audit-profiles\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.226732 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.226703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-client-ca-bundle\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.227087 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.227061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-secret-metrics-server-client-certs\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.227557 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.227534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a07c2b8a-d52e-4075-8340-3571cb362506-secret-metrics-server-tls\") pod \"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.232157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.232140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48r8z\" (UniqueName: \"kubernetes.io/projected/a07c2b8a-d52e-4075-8340-3571cb362506-kube-api-access-48r8z\") pod 
\"metrics-server-64f884946d-gq5fq\" (UID: \"a07c2b8a-d52e-4075-8340-3571cb362506\") " pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.312069 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.312002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:50:42.513438 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:42.513388 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64f884946d-gq5fq"] Apr 16 16:50:42.517343 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:42.517319 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda07c2b8a_d52e_4075_8340_3571cb362506.slice/crio-e3ef9a65450b9ad0039e69b6d820be1c2f623bd16114591a18829ee61e982b67 WatchSource:0}: Error finding container e3ef9a65450b9ad0039e69b6d820be1c2f623bd16114591a18829ee61e982b67: Status 404 returned error can't find the container with id e3ef9a65450b9ad0039e69b6d820be1c2f623bd16114591a18829ee61e982b67 Apr 16 16:50:43.022061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:43.022023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"15f90883afb00bf23397f71f4f70b8f800f8d4b1eda3f49a3fb2cbe631f1b858"} Apr 16 16:50:43.022061 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:43.022066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"e6adbe569086298a0ca290f2077d72e7ccf09f43063e023f348e98ffc6df9082"} Apr 16 16:50:43.022549 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:43.022080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" 
event={"ID":"c24cdb38-9b30-40da-a8f8-93f65dd132b9","Type":"ContainerStarted","Data":"28db0de44fa35b74af24aa9f8ba58fa1ccf2cfa465a9a3e1bd4df8748371b764"} Apr 16 16:50:43.022549 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:43.022350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:43.023207 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:43.023182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" event={"ID":"a07c2b8a-d52e-4075-8340-3571cb362506","Type":"ContainerStarted","Data":"e3ef9a65450b9ad0039e69b6d820be1c2f623bd16114591a18829ee61e982b67"} Apr 16 16:50:43.049599 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:43.049556 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" podStartSLOduration=1.592663483 podStartE2EDuration="4.049542513s" podCreationTimestamp="2026-04-16 16:50:39 +0000 UTC" firstStartedPulling="2026-04-16 16:50:39.990885213 +0000 UTC m=+197.088764267" lastFinishedPulling="2026-04-16 16:50:42.447764229 +0000 UTC m=+199.545643297" observedRunningTime="2026-04-16 16:50:43.047256356 +0000 UTC m=+200.145135456" watchObservedRunningTime="2026-04-16 16:50:43.049542513 +0000 UTC m=+200.147421587" Apr 16 16:50:44.027465 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:44.027431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" event={"ID":"a07c2b8a-d52e-4075-8340-3571cb362506","Type":"ContainerStarted","Data":"0cbd2360f8539dad2dc7548c2c0a2836aa12fc701853537311751fa65efcbd8e"} Apr 16 16:50:44.046473 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:44.046428 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" podStartSLOduration=1.92821781 podStartE2EDuration="3.046415429s" 
podCreationTimestamp="2026-04-16 16:50:41 +0000 UTC" firstStartedPulling="2026-04-16 16:50:42.519381033 +0000 UTC m=+199.617260086" lastFinishedPulling="2026-04-16 16:50:43.637578651 +0000 UTC m=+200.735457705" observedRunningTime="2026-04-16 16:50:44.045184508 +0000 UTC m=+201.143063581" watchObservedRunningTime="2026-04-16 16:50:44.046415429 +0000 UTC m=+201.144294537" Apr 16 16:50:48.435601 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:48.435566 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-558d79887-xgmcw"] Apr 16 16:50:48.435983 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:50:48.435720 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-558d79887-xgmcw" podUID="2f2cf009-1779-4552-ade1-87d49639f382" Apr 16 16:50:49.033359 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.033337 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-58d7f4698b-x74k9" Apr 16 16:50:49.040111 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.040090 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:50:49.044025 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.044008 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:50:49.174455 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174431 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-image-registry-private-configuration\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174568 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174463 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f2cf009-1779-4552-ade1-87d49639f382-ca-trust-extracted\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174568 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174483 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-installation-pull-secrets\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174568 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174509 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-registry-certificates\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174689 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174644 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-bound-sa-token\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: 
\"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174689 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174679 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd2q7\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-kube-api-access-dd2q7\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174778 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174707 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-trusted-ca\") pod \"2f2cf009-1779-4552-ade1-87d49639f382\" (UID: \"2f2cf009-1779-4552-ade1-87d49639f382\") " Apr 16 16:50:49.174778 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.174733 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f2cf009-1779-4552-ade1-87d49639f382-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:50:49.175045 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.175010 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:49.175170 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.175112 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:49.175573 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.175509 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f2cf009-1779-4552-ade1-87d49639f382-ca-trust-extracted\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:49.175573 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.175537 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-registry-certificates\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:49.175573 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.175552 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f2cf009-1779-4552-ade1-87d49639f382-trusted-ca\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:49.176892 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.176872 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:49.177180 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.177136 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-kube-api-access-dd2q7" (OuterVolumeSpecName: "kube-api-access-dd2q7") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "kube-api-access-dd2q7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:49.177290 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.177272 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:49.177430 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.177412 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2f2cf009-1779-4552-ade1-87d49639f382" (UID: "2f2cf009-1779-4552-ade1-87d49639f382"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:49.276684 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.276652 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dd2q7\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-kube-api-access-dd2q7\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:49.276684 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.276684 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-image-registry-private-configuration\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:49.276804 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.276699 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f2cf009-1779-4552-ade1-87d49639f382-installation-pull-secrets\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:49.276804 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:49.276713 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-bound-sa-token\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:50.042216 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:50.042191 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-558d79887-xgmcw" Apr 16 16:50:50.118428 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:50.118400 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-558d79887-xgmcw"] Apr 16 16:50:50.124934 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:50.124913 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-558d79887-xgmcw"] Apr 16 16:50:50.183353 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:50.183331 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f2cf009-1779-4552-ade1-87d49639f382-registry-tls\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:50:51.507914 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:51.507882 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2cf009-1779-4552-ade1-87d49639f382" path="/var/lib/kubelet/pods/2f2cf009-1779-4552-ade1-87d49639f382/volumes" Apr 16 16:50:57.110728 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.110691 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-c2zvb"] Apr 16 16:50:57.116175 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.116155 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:50:57.118861 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.118836 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:50:57.118961 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.118897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:50:57.118961 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.118846 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hrkpt\"" Apr 16 16:50:57.123573 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.123553 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-c2zvb"] Apr 16 16:50:57.232310 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.232284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4pw\" (UniqueName: \"kubernetes.io/projected/b8fcc185-91e6-436f-8394-c2766f17f7a6-kube-api-access-ds4pw\") pod \"downloads-586b57c7b4-c2zvb\" (UID: \"b8fcc185-91e6-436f-8394-c2766f17f7a6\") " pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:50:57.333274 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.333250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4pw\" (UniqueName: \"kubernetes.io/projected/b8fcc185-91e6-436f-8394-c2766f17f7a6-kube-api-access-ds4pw\") pod \"downloads-586b57c7b4-c2zvb\" (UID: \"b8fcc185-91e6-436f-8394-c2766f17f7a6\") " pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:50:57.341436 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.341421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4pw\" (UniqueName: 
\"kubernetes.io/projected/b8fcc185-91e6-436f-8394-c2766f17f7a6-kube-api-access-ds4pw\") pod \"downloads-586b57c7b4-c2zvb\" (UID: \"b8fcc185-91e6-436f-8394-c2766f17f7a6\") " pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:50:57.424628 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.424573 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:50:57.542575 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:57.542557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-c2zvb"] Apr 16 16:50:57.544738 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:50:57.544713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fcc185_91e6_436f_8394_c2766f17f7a6.slice/crio-b508278fcb3be9a1f90517a7ee290eba253661b9aad8c13228c344496118122d WatchSource:0}: Error finding container b508278fcb3be9a1f90517a7ee290eba253661b9aad8c13228c344496118122d: Status 404 returned error can't find the container with id b508278fcb3be9a1f90517a7ee290eba253661b9aad8c13228c344496118122d Apr 16 16:50:58.061538 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:50:58.061507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-c2zvb" event={"ID":"b8fcc185-91e6-436f-8394-c2766f17f7a6","Type":"ContainerStarted","Data":"b508278fcb3be9a1f90517a7ee290eba253661b9aad8c13228c344496118122d"} Apr 16 16:51:02.312190 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.312157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:51:02.312190 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.312197 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:51:02.647749 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:51:02.647668 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77c468ddc6-2w7dq"] Apr 16 16:51:02.651495 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.651471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.654140 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.654111 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:51:02.654140 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.654128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:51:02.655507 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.655412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:51:02.655507 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.655454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wqqlf\"" Apr 16 16:51:02.655507 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.655479 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:51:02.655802 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.655634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:51:02.663093 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.663072 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77c468ddc6-2w7dq"] Apr 16 16:51:02.778679 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.778654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m66vx\" (UniqueName: 
\"kubernetes.io/projected/12965720-9640-4fdf-86e7-04e6677b9842-kube-api-access-m66vx\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.778809 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.778700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-console-config\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.778809 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.778786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-oauth-serving-cert\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.778879 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.778824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-serving-cert\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.778879 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.778851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-oauth-config\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.778879 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:51:02.778865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-service-ca\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.880216 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.880187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-console-config\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.880359 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.880257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-oauth-serving-cert\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.880359 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.880288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-serving-cert\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.880359 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.880319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-oauth-config\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " 
pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.880359 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.880340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-service-ca\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.880559 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.880420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m66vx\" (UniqueName: \"kubernetes.io/projected/12965720-9640-4fdf-86e7-04e6677b9842-kube-api-access-m66vx\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.881392 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.881327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-console-config\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.881392 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.881327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-service-ca\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.881542 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.881406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-oauth-serving-cert\") pod \"console-77c468ddc6-2w7dq\" (UID: 
\"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.883534 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.883511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-oauth-config\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.883639 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.883536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-serving-cert\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.891510 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.891486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m66vx\" (UniqueName: \"kubernetes.io/projected/12965720-9640-4fdf-86e7-04e6677b9842-kube-api-access-m66vx\") pod \"console-77c468ddc6-2w7dq\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:02.962420 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:02.962351 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:03.114609 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:03.114577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77c468ddc6-2w7dq"] Apr 16 16:51:03.119090 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:51:03.119064 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12965720_9640_4fdf_86e7_04e6677b9842.slice/crio-ca84ccd476417a72a783f3136242cd61a86e433013d0caf61a1bf8e05dac03f1 WatchSource:0}: Error finding container ca84ccd476417a72a783f3136242cd61a86e433013d0caf61a1bf8e05dac03f1: Status 404 returned error can't find the container with id ca84ccd476417a72a783f3136242cd61a86e433013d0caf61a1bf8e05dac03f1 Apr 16 16:51:04.083630 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:04.083584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c468ddc6-2w7dq" event={"ID":"12965720-9640-4fdf-86e7-04e6677b9842","Type":"ContainerStarted","Data":"ca84ccd476417a72a783f3136242cd61a86e433013d0caf61a1bf8e05dac03f1"} Apr 16 16:51:06.090137 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:06.090105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c468ddc6-2w7dq" event={"ID":"12965720-9640-4fdf-86e7-04e6677b9842","Type":"ContainerStarted","Data":"4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b"} Apr 16 16:51:06.109080 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:06.109024 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77c468ddc6-2w7dq" podStartSLOduration=1.332895775 podStartE2EDuration="4.10900663s" podCreationTimestamp="2026-04-16 16:51:02 +0000 UTC" firstStartedPulling="2026-04-16 16:51:03.121174838 +0000 UTC m=+220.219053892" lastFinishedPulling="2026-04-16 16:51:05.897285686 +0000 UTC m=+222.995164747" observedRunningTime="2026-04-16 
16:51:06.107704268 +0000 UTC m=+223.205583367" watchObservedRunningTime="2026-04-16 16:51:06.10900663 +0000 UTC m=+223.206885706" Apr 16 16:51:10.413450 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.413413 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f87bd6d5d-v7qcg"] Apr 16 16:51:10.418040 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.418014 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.425570 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.425544 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f87bd6d5d-v7qcg"] Apr 16 16:51:10.427728 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.427215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:51:10.542820 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.542793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-service-ca\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.542996 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.542880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hfx6\" (UniqueName: \"kubernetes.io/projected/553e1664-7391-4e99-b297-e6f8ff126408-kube-api-access-4hfx6\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.542996 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.542910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-oauth-config\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.542996 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.542927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-trusted-ca-bundle\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.543135 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.543042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-serving-cert\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.543135 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.543087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-oauth-serving-cert\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.543135 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.543120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-console-config\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.644418 ip-10-0-130-1 kubenswrapper[2576]: 
I0416 16:51:10.644375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-console-config\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.644590 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.644446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-service-ca\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.644590 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.644552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hfx6\" (UniqueName: \"kubernetes.io/projected/553e1664-7391-4e99-b297-e6f8ff126408-kube-api-access-4hfx6\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.644699 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.644608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-oauth-config\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.644699 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.644639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-trusted-ca-bundle\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 
16:51:10.644798 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.644704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-serving-cert\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.644798 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.644744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-oauth-serving-cert\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.645974 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.645577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-oauth-serving-cert\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.646449 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.646310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-console-config\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.647200 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.647161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-service-ca\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " 
pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.649976 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.649926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-trusted-ca-bundle\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.650502 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.650482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-oauth-config\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.651856 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.651837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-serving-cert\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.654752 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.654728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hfx6\" (UniqueName: \"kubernetes.io/projected/553e1664-7391-4e99-b297-e6f8ff126408-kube-api-access-4hfx6\") pod \"console-5f87bd6d5d-v7qcg\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:10.730784 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:10.730724 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:12.963504 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:12.963470 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:12.963934 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:12.963523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:12.968867 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:12.968841 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:13.114206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:13.114179 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:15.674717 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:15.674676 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f87bd6d5d-v7qcg"] Apr 16 16:51:15.691189 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:51:15.691165 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553e1664_7391_4e99_b297_e6f8ff126408.slice/crio-5e99b1bf75f3e0ca42e5d5ca16a4fd255eff6a28054b5a98d33a05837dbd6bf0 WatchSource:0}: Error finding container 5e99b1bf75f3e0ca42e5d5ca16a4fd255eff6a28054b5a98d33a05837dbd6bf0: Status 404 returned error can't find the container with id 5e99b1bf75f3e0ca42e5d5ca16a4fd255eff6a28054b5a98d33a05837dbd6bf0 Apr 16 16:51:16.120988 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.120925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-c2zvb" event={"ID":"b8fcc185-91e6-436f-8394-c2766f17f7a6","Type":"ContainerStarted","Data":"6a534fd266d60f4d5fa00aa6b50be95f82404ec026bb683d42a02082e4265764"} Apr 16 16:51:16.121205 
ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.121187 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:51:16.122872 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.122842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87bd6d5d-v7qcg" event={"ID":"553e1664-7391-4e99-b297-e6f8ff126408","Type":"ContainerStarted","Data":"409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13"} Apr 16 16:51:16.122999 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.122879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87bd6d5d-v7qcg" event={"ID":"553e1664-7391-4e99-b297-e6f8ff126408","Type":"ContainerStarted","Data":"5e99b1bf75f3e0ca42e5d5ca16a4fd255eff6a28054b5a98d33a05837dbd6bf0"} Apr 16 16:51:16.138808 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.138781 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-c2zvb" Apr 16 16:51:16.139838 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.139800 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-c2zvb" podStartSLOduration=1.036883466 podStartE2EDuration="19.139787846s" podCreationTimestamp="2026-04-16 16:50:57 +0000 UTC" firstStartedPulling="2026-04-16 16:50:57.546416009 +0000 UTC m=+214.644295065" lastFinishedPulling="2026-04-16 16:51:15.649320391 +0000 UTC m=+232.747199445" observedRunningTime="2026-04-16 16:51:16.139373746 +0000 UTC m=+233.237252839" watchObservedRunningTime="2026-04-16 16:51:16.139787846 +0000 UTC m=+233.237666920" Apr 16 16:51:16.179767 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:16.179716 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f87bd6d5d-v7qcg" podStartSLOduration=6.179700048 podStartE2EDuration="6.179700048s" 
podCreationTimestamp="2026-04-16 16:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:51:16.178303489 +0000 UTC m=+233.276182579" watchObservedRunningTime="2026-04-16 16:51:16.179700048 +0000 UTC m=+233.277579123" Apr 16 16:51:20.730920 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:20.730876 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:20.731458 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:20.731185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:20.737338 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:20.737121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:21.140865 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:21.140837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:51:21.192121 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:21.188697 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77c468ddc6-2w7dq"] Apr 16 16:51:22.318404 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:22.318373 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:51:22.322633 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:22.322599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64f884946d-gq5fq" Apr 16 16:51:35.254934 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:35.254863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:51:35.257120 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:35.257102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f7cce27-fc9e-437d-9147-a82b82151b07-metrics-certs\") pod \"network-metrics-daemon-dc7qq\" (UID: \"0f7cce27-fc9e-437d-9147-a82b82151b07\") " pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:51:35.308857 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:35.308833 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fz45n\"" Apr 16 16:51:35.316200 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:35.316179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dc7qq" Apr 16 16:51:35.429649 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:35.429626 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dc7qq"] Apr 16 16:51:35.432182 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:51:35.432147 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f7cce27_fc9e_437d_9147_a82b82151b07.slice/crio-e1e75488ec75fa5f6a96768752a974e48b2096a55bbc4fa66c812e7d21a2b518 WatchSource:0}: Error finding container e1e75488ec75fa5f6a96768752a974e48b2096a55bbc4fa66c812e7d21a2b518: Status 404 returned error can't find the container with id e1e75488ec75fa5f6a96768752a974e48b2096a55bbc4fa66c812e7d21a2b518 Apr 16 16:51:36.181107 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:36.181070 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe" 
containerID="d80e40753c8b69ab90933d9ff2f3c854845dad1e644e7e3728de648a53071fc1" exitCode=0 Apr 16 16:51:36.181296 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:36.181147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" event={"ID":"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe","Type":"ContainerDied","Data":"d80e40753c8b69ab90933d9ff2f3c854845dad1e644e7e3728de648a53071fc1"} Apr 16 16:51:36.181530 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:36.181508 2576 scope.go:117] "RemoveContainer" containerID="d80e40753c8b69ab90933d9ff2f3c854845dad1e644e7e3728de648a53071fc1" Apr 16 16:51:36.182489 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:36.182466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dc7qq" event={"ID":"0f7cce27-fc9e-437d-9147-a82b82151b07","Type":"ContainerStarted","Data":"e1e75488ec75fa5f6a96768752a974e48b2096a55bbc4fa66c812e7d21a2b518"} Apr 16 16:51:37.186850 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:37.186813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-tqhcx" event={"ID":"c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe","Type":"ContainerStarted","Data":"26af073b581eeca876e1523266793a2fba33920561d1e7895cf421d340fbf088"} Apr 16 16:51:37.188226 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:37.188206 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dc7qq" event={"ID":"0f7cce27-fc9e-437d-9147-a82b82151b07","Type":"ContainerStarted","Data":"4a751ba5cb98c39d38e245a8f9d0f721cd648f8a2a6441ad74d29b24635c03d3"} Apr 16 16:51:37.188283 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:37.188231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dc7qq" 
event={"ID":"0f7cce27-fc9e-437d-9147-a82b82151b07","Type":"ContainerStarted","Data":"17e079e75fadf6e14baea848c3eea1e82e620b24f8d595fa3cce774985c6156a"} Apr 16 16:51:37.217206 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:37.217157 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dc7qq" podStartSLOduration=253.15261126 podStartE2EDuration="4m14.217140786s" podCreationTimestamp="2026-04-16 16:47:23 +0000 UTC" firstStartedPulling="2026-04-16 16:51:35.433815159 +0000 UTC m=+252.531694216" lastFinishedPulling="2026-04-16 16:51:36.498344682 +0000 UTC m=+253.596223742" observedRunningTime="2026-04-16 16:51:37.215631643 +0000 UTC m=+254.313510720" watchObservedRunningTime="2026-04-16 16:51:37.217140786 +0000 UTC m=+254.315019862" Apr 16 16:51:46.213558 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.213497 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77c468ddc6-2w7dq" podUID="12965720-9640-4fdf-86e7-04e6677b9842" containerName="console" containerID="cri-o://4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b" gracePeriod=15 Apr 16 16:51:46.480241 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.480222 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77c468ddc6-2w7dq_12965720-9640-4fdf-86e7-04e6677b9842/console/0.log" Apr 16 16:51:46.480349 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.480280 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:46.532537 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532501 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-console-config\") pod \"12965720-9640-4fdf-86e7-04e6677b9842\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " Apr 16 16:51:46.532665 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532543 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-serving-cert\") pod \"12965720-9640-4fdf-86e7-04e6677b9842\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " Apr 16 16:51:46.532665 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532586 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-service-ca\") pod \"12965720-9640-4fdf-86e7-04e6677b9842\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " Apr 16 16:51:46.532771 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532756 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-oauth-serving-cert\") pod \"12965720-9640-4fdf-86e7-04e6677b9842\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " Apr 16 16:51:46.532827 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532796 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m66vx\" (UniqueName: \"kubernetes.io/projected/12965720-9640-4fdf-86e7-04e6677b9842-kube-api-access-m66vx\") pod \"12965720-9640-4fdf-86e7-04e6677b9842\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " Apr 16 16:51:46.532827 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:51:46.532815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-oauth-config\") pod \"12965720-9640-4fdf-86e7-04e6677b9842\" (UID: \"12965720-9640-4fdf-86e7-04e6677b9842\") " Apr 16 16:51:46.532925 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532900 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-service-ca" (OuterVolumeSpecName: "service-ca") pod "12965720-9640-4fdf-86e7-04e6677b9842" (UID: "12965720-9640-4fdf-86e7-04e6677b9842"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:51:46.533010 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.532916 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-console-config" (OuterVolumeSpecName: "console-config") pod "12965720-9640-4fdf-86e7-04e6677b9842" (UID: "12965720-9640-4fdf-86e7-04e6677b9842"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:51:46.533196 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.533163 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "12965720-9640-4fdf-86e7-04e6677b9842" (UID: "12965720-9640-4fdf-86e7-04e6677b9842"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:51:46.533316 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.533260 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-console-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:51:46.533316 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.533281 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-service-ca\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:51:46.533316 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.533295 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12965720-9640-4fdf-86e7-04e6677b9842-oauth-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:51:46.534859 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.534840 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "12965720-9640-4fdf-86e7-04e6677b9842" (UID: "12965720-9640-4fdf-86e7-04e6677b9842"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:46.534999 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.534916 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "12965720-9640-4fdf-86e7-04e6677b9842" (UID: "12965720-9640-4fdf-86e7-04e6677b9842"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:46.535065 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.535039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12965720-9640-4fdf-86e7-04e6677b9842-kube-api-access-m66vx" (OuterVolumeSpecName: "kube-api-access-m66vx") pod "12965720-9640-4fdf-86e7-04e6677b9842" (UID: "12965720-9640-4fdf-86e7-04e6677b9842"). InnerVolumeSpecName "kube-api-access-m66vx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:51:46.633740 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.633719 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m66vx\" (UniqueName: \"kubernetes.io/projected/12965720-9640-4fdf-86e7-04e6677b9842-kube-api-access-m66vx\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:51:46.633740 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.633737 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-oauth-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:51:46.633856 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:46.633746 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12965720-9640-4fdf-86e7-04e6677b9842-console-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:51:47.217918 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.217899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77c468ddc6-2w7dq_12965720-9640-4fdf-86e7-04e6677b9842/console/0.log" Apr 16 16:51:47.218237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.217931 2576 generic.go:358] "Generic (PLEG): container finished" podID="12965720-9640-4fdf-86e7-04e6677b9842" containerID="4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b" 
exitCode=2 Apr 16 16:51:47.218237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.218004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c468ddc6-2w7dq" event={"ID":"12965720-9640-4fdf-86e7-04e6677b9842","Type":"ContainerDied","Data":"4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b"} Apr 16 16:51:47.218237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.218010 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77c468ddc6-2w7dq" Apr 16 16:51:47.218237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.218030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77c468ddc6-2w7dq" event={"ID":"12965720-9640-4fdf-86e7-04e6677b9842","Type":"ContainerDied","Data":"ca84ccd476417a72a783f3136242cd61a86e433013d0caf61a1bf8e05dac03f1"} Apr 16 16:51:47.218237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.218047 2576 scope.go:117] "RemoveContainer" containerID="4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b" Apr 16 16:51:47.225859 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.225846 2576 scope.go:117] "RemoveContainer" containerID="4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b" Apr 16 16:51:47.226122 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:51:47.226104 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b\": container with ID starting with 4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b not found: ID does not exist" containerID="4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b" Apr 16 16:51:47.226171 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.226128 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b"} err="failed to get container status \"4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b\": rpc error: code = NotFound desc = could not find container \"4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b\": container with ID starting with 4c5c4ef481b84af832db56ec2a99d072295b016de85bb4a7dda6bfe1591d179b not found: ID does not exist" Apr 16 16:51:47.236836 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.236812 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77c468ddc6-2w7dq"] Apr 16 16:51:47.242401 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.242380 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77c468ddc6-2w7dq"] Apr 16 16:51:47.507999 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:51:47.507977 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12965720-9640-4fdf-86e7-04e6677b9842" path="/var/lib/kubelet/pods/12965720-9640-4fdf-86e7-04e6677b9842/volumes" Apr 16 16:52:03.900454 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:52:03.900418 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rjlc7" podUID="2c428e0a-19c4-4e80-ba25-9b1be39d973e" Apr 16 16:52:03.900454 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:52:03.900436 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" podUID="7a1cc086-1280-418f-b306-f49c2436ad0c" Apr 16 16:52:04.261841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.261816 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:52:04.261841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.261828 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:52:04.665247 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.665179 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f55d5cfc6-nhglb"] Apr 16 16:52:04.665510 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.665494 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12965720-9640-4fdf-86e7-04e6677b9842" containerName="console" Apr 16 16:52:04.665580 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.665513 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="12965720-9640-4fdf-86e7-04e6677b9842" containerName="console" Apr 16 16:52:04.665580 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.665575 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="12965720-9640-4fdf-86e7-04e6677b9842" containerName="console" Apr 16 16:52:04.668080 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.668063 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.678227 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.678205 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f55d5cfc6-nhglb"] Apr 16 16:52:04.756222 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-trusted-ca-bundle\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.756311 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-oauth-serving-cert\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.756311 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-serving-cert\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.756311 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plwgx\" (UniqueName: \"kubernetes.io/projected/090c127d-b603-44a0-905a-7739d6be5948-kube-api-access-plwgx\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 
16:52:04.756311 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-oauth-config\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.756439 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-service-ca\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.756439 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.756354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-console-config\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856593 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-oauth-serving-cert\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856686 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-serving-cert\") pod \"console-f55d5cfc6-nhglb\" (UID: 
\"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856686 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plwgx\" (UniqueName: \"kubernetes.io/projected/090c127d-b603-44a0-905a-7739d6be5948-kube-api-access-plwgx\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856768 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-oauth-config\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856825 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-service-ca\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856860 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-console-config\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.856981 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.856933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-trusted-ca-bundle\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.857363 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.857338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-oauth-serving-cert\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.857528 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.857425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-service-ca\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.857637 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.857520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-console-config\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.857697 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.857635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-trusted-ca-bundle\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.859140 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.859123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-oauth-config\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.859248 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.859233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-serving-cert\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.865105 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.865083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plwgx\" (UniqueName: \"kubernetes.io/projected/090c127d-b603-44a0-905a-7739d6be5948-kube-api-access-plwgx\") pod \"console-f55d5cfc6-nhglb\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:04.976347 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:04.976304 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:05.089275 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:05.089163 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f55d5cfc6-nhglb"] Apr 16 16:52:05.091447 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:52:05.091421 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090c127d_b603_44a0_905a_7739d6be5948.slice/crio-41c1b3891ad1ed3dc110eda5a859abceb7febd83b3eac959f406666d0cc55d0e WatchSource:0}: Error finding container 41c1b3891ad1ed3dc110eda5a859abceb7febd83b3eac959f406666d0cc55d0e: Status 404 returned error can't find the container with id 41c1b3891ad1ed3dc110eda5a859abceb7febd83b3eac959f406666d0cc55d0e Apr 16 16:52:05.265885 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:05.265851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f55d5cfc6-nhglb" event={"ID":"090c127d-b603-44a0-905a-7739d6be5948","Type":"ContainerStarted","Data":"c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3"} Apr 16 16:52:05.266045 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:05.265892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f55d5cfc6-nhglb" event={"ID":"090c127d-b603-44a0-905a-7739d6be5948","Type":"ContainerStarted","Data":"41c1b3891ad1ed3dc110eda5a859abceb7febd83b3eac959f406666d0cc55d0e"} Apr 16 16:52:05.287852 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:05.287808 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f55d5cfc6-nhglb" podStartSLOduration=1.287795106 podStartE2EDuration="1.287795106s" podCreationTimestamp="2026-04-16 16:52:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:52:05.285761255 +0000 UTC m=+282.383640341" 
watchObservedRunningTime="2026-04-16 16:52:05.287795106 +0000 UTC m=+282.385674178" Apr 16 16:52:07.678154 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.678118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:52:07.680552 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.680516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7a1cc086-1280-418f-b306-f49c2436ad0c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-4tmzr\" (UID: \"7a1cc086-1280-418f-b306-f49c2436ad0c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:52:07.779015 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.778987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:52:07.779101 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.779018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:52:07.781355 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.781335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/1997f718-c32f-43e2-8412-29f59bf82303-metrics-tls\") pod \"dns-default-2j894\" (UID: \"1997f718-c32f-43e2-8412-29f59bf82303\") " pod="openshift-dns/dns-default-2j894" Apr 16 16:52:07.781444 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.781358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2c428e0a-19c4-4e80-ba25-9b1be39d973e-cert\") pod \"ingress-canary-rjlc7\" (UID: \"2c428e0a-19c4-4e80-ba25-9b1be39d973e\") " pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:52:07.865093 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.865068 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s9vpx\"" Apr 16 16:52:07.866160 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.866137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-kh49p\"" Apr 16 16:52:07.873343 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.873330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rjlc7" Apr 16 16:52:07.873419 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.873406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" Apr 16 16:52:07.907716 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.907693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-z7whv\"" Apr 16 16:52:07.915958 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:07.915923 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2j894" Apr 16 16:52:08.009616 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:08.009576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rjlc7"] Apr 16 16:52:08.012187 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:52:08.012149 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c428e0a_19c4_4e80_ba25_9b1be39d973e.slice/crio-d31e2ba8ee0a5cbcd2615364e38a535921de91b6922b5c01a1f09ae448f5e0ed WatchSource:0}: Error finding container d31e2ba8ee0a5cbcd2615364e38a535921de91b6922b5c01a1f09ae448f5e0ed: Status 404 returned error can't find the container with id d31e2ba8ee0a5cbcd2615364e38a535921de91b6922b5c01a1f09ae448f5e0ed Apr 16 16:52:08.022275 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:08.022255 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr"] Apr 16 16:52:08.025223 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:52:08.025197 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1cc086_1280_418f_b306_f49c2436ad0c.slice/crio-56919f650d36ae9e60932182918986ee289756b237deb393cc0fe4f43892461f WatchSource:0}: Error finding container 56919f650d36ae9e60932182918986ee289756b237deb393cc0fe4f43892461f: Status 404 returned error can't find the container with id 56919f650d36ae9e60932182918986ee289756b237deb393cc0fe4f43892461f Apr 16 16:52:08.054115 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:08.054063 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2j894"] Apr 16 16:52:08.056207 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:52:08.056182 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1997f718_c32f_43e2_8412_29f59bf82303.slice/crio-07c146727c90020cde52676c7ef0b3fc6a9ada1c2d73f5a82639223c9530f029 WatchSource:0}: Error finding container 07c146727c90020cde52676c7ef0b3fc6a9ada1c2d73f5a82639223c9530f029: Status 404 returned error can't find the container with id 07c146727c90020cde52676c7ef0b3fc6a9ada1c2d73f5a82639223c9530f029 Apr 16 16:52:08.274099 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:08.274071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2j894" event={"ID":"1997f718-c32f-43e2-8412-29f59bf82303","Type":"ContainerStarted","Data":"07c146727c90020cde52676c7ef0b3fc6a9ada1c2d73f5a82639223c9530f029"} Apr 16 16:52:08.274967 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:08.274925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rjlc7" event={"ID":"2c428e0a-19c4-4e80-ba25-9b1be39d973e","Type":"ContainerStarted","Data":"d31e2ba8ee0a5cbcd2615364e38a535921de91b6922b5c01a1f09ae448f5e0ed"} Apr 16 16:52:08.275829 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:08.275814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" event={"ID":"7a1cc086-1280-418f-b306-f49c2436ad0c","Type":"ContainerStarted","Data":"56919f650d36ae9e60932182918986ee289756b237deb393cc0fe4f43892461f"} Apr 16 16:52:10.283691 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.283633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2j894" event={"ID":"1997f718-c32f-43e2-8412-29f59bf82303","Type":"ContainerStarted","Data":"7082db34a532de909bfbbff036f02af5bd80e76f5c8916d31f0abb430197480a"} Apr 16 16:52:10.283691 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.283667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2j894" 
event={"ID":"1997f718-c32f-43e2-8412-29f59bf82303","Type":"ContainerStarted","Data":"5314124061bdc1595cf4a136e98fea55d705efee10a00c60fd13bfa228e6108e"} Apr 16 16:52:10.284155 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.283778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2j894" Apr 16 16:52:10.285189 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.285166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rjlc7" event={"ID":"2c428e0a-19c4-4e80-ba25-9b1be39d973e","Type":"ContainerStarted","Data":"d2aaee8d60d57783fdf7aeb096c734d49de568f0d7f344f789c00001e9bf5d2d"} Apr 16 16:52:10.286514 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.286492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" event={"ID":"7a1cc086-1280-418f-b306-f49c2436ad0c","Type":"ContainerStarted","Data":"605a595b3b82964beca76db82ea24f6175fe28494dd0ace3c36d684f76f28e3b"} Apr 16 16:52:10.301328 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.301271 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2j894" podStartSLOduration=251.32191699 podStartE2EDuration="4m13.301257381s" podCreationTimestamp="2026-04-16 16:47:57 +0000 UTC" firstStartedPulling="2026-04-16 16:52:08.057852192 +0000 UTC m=+285.155731246" lastFinishedPulling="2026-04-16 16:52:10.037192574 +0000 UTC m=+287.135071637" observedRunningTime="2026-04-16 16:52:10.299814149 +0000 UTC m=+287.397693260" watchObservedRunningTime="2026-04-16 16:52:10.301257381 +0000 UTC m=+287.399136448" Apr 16 16:52:10.315394 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.315169 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rjlc7" podStartSLOduration=251.295369174 podStartE2EDuration="4m13.315154976s" podCreationTimestamp="2026-04-16 
16:47:57 +0000 UTC" firstStartedPulling="2026-04-16 16:52:08.01432261 +0000 UTC m=+285.112201663" lastFinishedPulling="2026-04-16 16:52:10.03410841 +0000 UTC m=+287.131987465" observedRunningTime="2026-04-16 16:52:10.313729308 +0000 UTC m=+287.411608404" watchObservedRunningTime="2026-04-16 16:52:10.315154976 +0000 UTC m=+287.413034052" Apr 16 16:52:10.329139 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:10.329097 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-4tmzr" podStartSLOduration=264.324057338 podStartE2EDuration="4m26.329081936s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:52:08.027344385 +0000 UTC m=+285.125223439" lastFinishedPulling="2026-04-16 16:52:10.032368972 +0000 UTC m=+287.130248037" observedRunningTime="2026-04-16 16:52:10.329049378 +0000 UTC m=+287.426928452" watchObservedRunningTime="2026-04-16 16:52:10.329081936 +0000 UTC m=+287.426961012" Apr 16 16:52:14.976421 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:14.976387 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:14.976789 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:14.976679 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:14.980767 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:14.980745 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:15.305894 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:15.305868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:52:15.352134 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:15.352109 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-5f87bd6d5d-v7qcg"] Apr 16 16:52:20.291143 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:20.291116 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2j894" Apr 16 16:52:23.422976 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:23.422932 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 16:52:23.423530 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:23.423506 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 16:52:23.425748 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:23.425730 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:52:40.370237 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.370181 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f87bd6d5d-v7qcg" podUID="553e1664-7391-4e99-b297-e6f8ff126408" containerName="console" containerID="cri-o://409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13" gracePeriod=15 Apr 16 16:52:40.599353 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.599330 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f87bd6d5d-v7qcg_553e1664-7391-4e99-b297-e6f8ff126408/console/0.log" Apr 16 16:52:40.599468 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.599399 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:52:40.696446 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696386 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-trusted-ca-bundle\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696446 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696413 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-oauth-serving-cert\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696446 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696430 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hfx6\" (UniqueName: \"kubernetes.io/projected/553e1664-7391-4e99-b297-e6f8ff126408-kube-api-access-4hfx6\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696691 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696454 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-service-ca\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696691 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-console-config\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696691 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:52:40.696497 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-oauth-config\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696691 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696541 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-serving-cert\") pod \"553e1664-7391-4e99-b297-e6f8ff126408\" (UID: \"553e1664-7391-4e99-b297-e6f8ff126408\") " Apr 16 16:52:40.696889 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696838 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:40.696889 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.696854 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:40.697030 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.697001 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-console-config" (OuterVolumeSpecName: "console-config") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:40.697072 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.697049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-service-ca" (OuterVolumeSpecName: "service-ca") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:40.698549 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.698525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553e1664-7391-4e99-b297-e6f8ff126408-kube-api-access-4hfx6" (OuterVolumeSpecName: "kube-api-access-4hfx6") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "kube-api-access-4hfx6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:52:40.698679 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.698661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:40.698737 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.698691 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "553e1664-7391-4e99-b297-e6f8ff126408" (UID: "553e1664-7391-4e99-b297-e6f8ff126408"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:40.797442 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797419 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-oauth-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:40.797442 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797439 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/553e1664-7391-4e99-b297-e6f8ff126408-console-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:40.797553 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797448 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-trusted-ca-bundle\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:40.797553 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797457 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-oauth-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:40.797553 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797466 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hfx6\" (UniqueName: 
\"kubernetes.io/projected/553e1664-7391-4e99-b297-e6f8ff126408-kube-api-access-4hfx6\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:40.797553 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797474 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-service-ca\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:40.797553 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:40.797483 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/553e1664-7391-4e99-b297-e6f8ff126408-console-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:52:41.372443 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.372423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f87bd6d5d-v7qcg_553e1664-7391-4e99-b297-e6f8ff126408/console/0.log" Apr 16 16:52:41.372828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.372464 2576 generic.go:358] "Generic (PLEG): container finished" podID="553e1664-7391-4e99-b297-e6f8ff126408" containerID="409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13" exitCode=2 Apr 16 16:52:41.372828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.372496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87bd6d5d-v7qcg" event={"ID":"553e1664-7391-4e99-b297-e6f8ff126408","Type":"ContainerDied","Data":"409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13"} Apr 16 16:52:41.372828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.372544 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f87bd6d5d-v7qcg" Apr 16 16:52:41.372828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.372549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87bd6d5d-v7qcg" event={"ID":"553e1664-7391-4e99-b297-e6f8ff126408","Type":"ContainerDied","Data":"5e99b1bf75f3e0ca42e5d5ca16a4fd255eff6a28054b5a98d33a05837dbd6bf0"} Apr 16 16:52:41.372828 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.372573 2576 scope.go:117] "RemoveContainer" containerID="409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13" Apr 16 16:52:41.383304 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.383286 2576 scope.go:117] "RemoveContainer" containerID="409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13" Apr 16 16:52:41.383613 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:52:41.383595 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13\": container with ID starting with 409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13 not found: ID does not exist" containerID="409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13" Apr 16 16:52:41.383675 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.383622 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13"} err="failed to get container status \"409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13\": rpc error: code = NotFound desc = could not find container \"409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13\": container with ID starting with 409a9d9811d606b67718d7e61911ac655406d32072e848d3fd7a357a6f247e13 not found: ID does not exist" Apr 16 16:52:41.392995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.392972 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-5f87bd6d5d-v7qcg"] Apr 16 16:52:41.397271 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.397253 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f87bd6d5d-v7qcg"] Apr 16 16:52:41.508147 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:52:41.508120 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553e1664-7391-4e99-b297-e6f8ff126408" path="/var/lib/kubelet/pods/553e1664-7391-4e99-b297-e6f8ff126408/volumes" Apr 16 16:53:13.569255 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.569183 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-749865d7b4-rc7tq"] Apr 16 16:53:13.569650 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.569426 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="553e1664-7391-4e99-b297-e6f8ff126408" containerName="console" Apr 16 16:53:13.569650 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.569436 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="553e1664-7391-4e99-b297-e6f8ff126408" containerName="console" Apr 16 16:53:13.569650 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.569490 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="553e1664-7391-4e99-b297-e6f8ff126408" containerName="console" Apr 16 16:53:13.572137 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.572122 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.587558 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.587536 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-749865d7b4-rc7tq"] Apr 16 16:53:13.614085 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-trusted-ca-bundle\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.614181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-oauth-serving-cert\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.614181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-config\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.614181 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22j4h\" (UniqueName: \"kubernetes.io/projected/6e382386-4def-4fa5-a98c-75bef20ebf0b-kube-api-access-22j4h\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 
16:53:13.614282 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-service-ca\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.614282 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-serving-cert\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.614282 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.614256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-oauth-config\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715334 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-service-ca\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715334 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-serving-cert\") pod \"console-749865d7b4-rc7tq\" (UID: 
\"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-oauth-config\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-trusted-ca-bundle\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-oauth-serving-cert\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-config\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.715486 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.715464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22j4h\" (UniqueName: 
\"kubernetes.io/projected/6e382386-4def-4fa5-a98c-75bef20ebf0b-kube-api-access-22j4h\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.716030 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.716009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-service-ca\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.716284 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.716268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-config\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.716331 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.716314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-oauth-serving-cert\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.716565 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.716540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-trusted-ca-bundle\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.717995 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.717976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-oauth-config\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.718070 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.718018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-serving-cert\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.723801 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.723782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22j4h\" (UniqueName: \"kubernetes.io/projected/6e382386-4def-4fa5-a98c-75bef20ebf0b-kube-api-access-22j4h\") pod \"console-749865d7b4-rc7tq\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.880730 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.880684 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:13.999610 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:13.999576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-749865d7b4-rc7tq"] Apr 16 16:53:14.002828 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:53:14.002798 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e382386_4def_4fa5_a98c_75bef20ebf0b.slice/crio-0df10e9bc0ca0179481c281a9fb9e15110008607ba6d2a27d19eef3f7cb8def2 WatchSource:0}: Error finding container 0df10e9bc0ca0179481c281a9fb9e15110008607ba6d2a27d19eef3f7cb8def2: Status 404 returned error can't find the container with id 0df10e9bc0ca0179481c281a9fb9e15110008607ba6d2a27d19eef3f7cb8def2 Apr 16 16:53:14.004851 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:14.004835 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:53:14.468320 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:14.468288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-749865d7b4-rc7tq" event={"ID":"6e382386-4def-4fa5-a98c-75bef20ebf0b","Type":"ContainerStarted","Data":"e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13"} Apr 16 16:53:14.468320 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:14.468320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-749865d7b4-rc7tq" event={"ID":"6e382386-4def-4fa5-a98c-75bef20ebf0b","Type":"ContainerStarted","Data":"0df10e9bc0ca0179481c281a9fb9e15110008607ba6d2a27d19eef3f7cb8def2"} Apr 16 16:53:14.484909 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:14.484867 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-749865d7b4-rc7tq" podStartSLOduration=1.484854292 podStartE2EDuration="1.484854292s" podCreationTimestamp="2026-04-16 16:53:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:53:14.483239746 +0000 UTC m=+351.581118821" watchObservedRunningTime="2026-04-16 16:53:14.484854292 +0000 UTC m=+351.582733366" Apr 16 16:53:23.881242 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:23.881210 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:23.881577 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:23.881253 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:23.885685 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:23.885665 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:24.498954 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:24.498915 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:53:24.540458 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:24.540423 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f55d5cfc6-nhglb"] Apr 16 16:53:49.559384 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.559346 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f55d5cfc6-nhglb" podUID="090c127d-b603-44a0-905a-7739d6be5948" containerName="console" containerID="cri-o://c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3" gracePeriod=15 Apr 16 16:53:49.786651 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.786632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f55d5cfc6-nhglb_090c127d-b603-44a0-905a-7739d6be5948/console/0.log" Apr 16 16:53:49.786752 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.786689 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:53:49.867393 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867334 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-oauth-config\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 16:53:49.867393 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867379 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-serving-cert\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 16:53:49.867572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867412 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plwgx\" (UniqueName: \"kubernetes.io/projected/090c127d-b603-44a0-905a-7739d6be5948-kube-api-access-plwgx\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 16:53:49.867572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-trusted-ca-bundle\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 16:53:49.867572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867447 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-console-config\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 
16:53:49.867572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867470 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-service-ca\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 16:53:49.867572 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867485 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-oauth-serving-cert\") pod \"090c127d-b603-44a0-905a-7739d6be5948\" (UID: \"090c127d-b603-44a0-905a-7739d6be5948\") " Apr 16 16:53:49.867904 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867877 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-service-ca" (OuterVolumeSpecName: "service-ca") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:53:49.868001 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867902 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:53:49.868001 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867912 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:53:49.868001 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.867921 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-console-config" (OuterVolumeSpecName: "console-config") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:53:49.869516 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.869493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:53:49.870079 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.870058 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:53:49.870152 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.870089 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090c127d-b603-44a0-905a-7739d6be5948-kube-api-access-plwgx" (OuterVolumeSpecName: "kube-api-access-plwgx") pod "090c127d-b603-44a0-905a-7739d6be5948" (UID: "090c127d-b603-44a0-905a-7739d6be5948"). InnerVolumeSpecName "kube-api-access-plwgx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:53:49.968622 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968598 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:49.968622 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968620 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-plwgx\" (UniqueName: \"kubernetes.io/projected/090c127d-b603-44a0-905a-7739d6be5948-kube-api-access-plwgx\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:49.968741 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968630 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-trusted-ca-bundle\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:49.968741 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968639 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-console-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:49.968741 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968648 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-service-ca\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:49.968741 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968655 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090c127d-b603-44a0-905a-7739d6be5948-oauth-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:49.968741 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:49.968665 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090c127d-b603-44a0-905a-7739d6be5948-console-oauth-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:53:50.560504 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.560477 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f55d5cfc6-nhglb_090c127d-b603-44a0-905a-7739d6be5948/console/0.log" Apr 16 16:53:50.560873 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.560520 2576 generic.go:358] "Generic (PLEG): container finished" podID="090c127d-b603-44a0-905a-7739d6be5948" containerID="c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3" exitCode=2 Apr 16 16:53:50.560873 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.560579 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f55d5cfc6-nhglb" Apr 16 16:53:50.560873 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.560604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f55d5cfc6-nhglb" event={"ID":"090c127d-b603-44a0-905a-7739d6be5948","Type":"ContainerDied","Data":"c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3"} Apr 16 16:53:50.560873 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.560637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f55d5cfc6-nhglb" event={"ID":"090c127d-b603-44a0-905a-7739d6be5948","Type":"ContainerDied","Data":"41c1b3891ad1ed3dc110eda5a859abceb7febd83b3eac959f406666d0cc55d0e"} Apr 16 16:53:50.560873 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.560652 2576 scope.go:117] "RemoveContainer" containerID="c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3" Apr 16 16:53:50.568783 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.568768 2576 scope.go:117] "RemoveContainer" containerID="c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3" Apr 16 16:53:50.569075 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:53:50.569049 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3\": container with ID starting with c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3 not found: ID does not exist" containerID="c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3" Apr 16 16:53:50.569139 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.569086 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3"} err="failed to get container status \"c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3\": rpc error: code = NotFound desc = 
could not find container \"c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3\": container with ID starting with c06b2d655e40515d311c5641f061e613d18af441bbf1720c3bbf93e6cfd047b3 not found: ID does not exist" Apr 16 16:53:50.585157 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.585135 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f55d5cfc6-nhglb"] Apr 16 16:53:50.589339 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:50.589321 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f55d5cfc6-nhglb"] Apr 16 16:53:51.508857 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:53:51.508828 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090c127d-b603-44a0-905a-7739d6be5948" path="/var/lib/kubelet/pods/090c127d-b603-44a0-905a-7739d6be5948/volumes" Apr 16 16:54:14.041465 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.041433 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm"] Apr 16 16:54:14.041875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.041700 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="090c127d-b603-44a0-905a-7739d6be5948" containerName="console" Apr 16 16:54:14.041875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.041710 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="090c127d-b603-44a0-905a-7739d6be5948" containerName="console" Apr 16 16:54:14.041875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.041758 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="090c127d-b603-44a0-905a-7739d6be5948" containerName="console" Apr 16 16:54:14.044434 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.044420 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.048700 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.048677 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:54:14.048841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.048698 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:54:14.048841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.048708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-sqc6p\"" Apr 16 16:54:14.048841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.048679 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 16:54:14.048841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.048679 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:54:14.053690 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.053670 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm"] Apr 16 16:54:14.124918 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.124889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4pfb\" (UniqueName: \"kubernetes.io/projected/a1744e7b-b60f-4273-aa2c-23c7e1d973a3-kube-api-access-t4pfb\") pod \"managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm\" (UID: \"a1744e7b-b60f-4273-aa2c-23c7e1d973a3\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.125047 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.124962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1744e7b-b60f-4273-aa2c-23c7e1d973a3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm\" (UID: \"a1744e7b-b60f-4273-aa2c-23c7e1d973a3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.150343 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.150317 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"] Apr 16 16:54:14.154424 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.154398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.157846 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.157824 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 16:54:14.158149 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.158133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 16:54:14.158553 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.158537 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 16:54:14.158636 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.158561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 16:54:14.165806 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.165786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"] Apr 16 16:54:14.225697 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.225771 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.225771 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-hub\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.225771 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-ca\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.225771 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4pfb\" (UniqueName: \"kubernetes.io/projected/a1744e7b-b60f-4273-aa2c-23c7e1d973a3-kube-api-access-t4pfb\") pod \"managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm\" (UID: \"a1744e7b-b60f-4273-aa2c-23c7e1d973a3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.225928 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/75ff8df7-344f-4b0a-9028-97acaa4f0668-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.225928 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdsf7\" (UniqueName: \"kubernetes.io/projected/75ff8df7-344f-4b0a-9028-97acaa4f0668-kube-api-access-mdsf7\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.225928 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.225864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1744e7b-b60f-4273-aa2c-23c7e1d973a3-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm\" (UID: \"a1744e7b-b60f-4273-aa2c-23c7e1d973a3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.228321 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.228301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1744e7b-b60f-4273-aa2c-23c7e1d973a3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm\" (UID: \"a1744e7b-b60f-4273-aa2c-23c7e1d973a3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.233409 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.233384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4pfb\" (UniqueName: \"kubernetes.io/projected/a1744e7b-b60f-4273-aa2c-23c7e1d973a3-kube-api-access-t4pfb\") pod \"managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm\" (UID: \"a1744e7b-b60f-4273-aa2c-23c7e1d973a3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" Apr 16 16:54:14.327255 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.327205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-ca\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.327255 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.327245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/75ff8df7-344f-4b0a-9028-97acaa4f0668-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.327370 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.327260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdsf7\" (UniqueName: \"kubernetes.io/projected/75ff8df7-344f-4b0a-9028-97acaa4f0668-kube-api-access-mdsf7\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.327370 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.327293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.327435 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.327400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.327473 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.327440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-hub\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" Apr 16 16:54:14.328124 ip-10-0-130-1 kubenswrapper[2576]: I0416 
16:54:14.328063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/75ff8df7-344f-4b0a-9028-97acaa4f0668-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.329603 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.329579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-ca\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.329725 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.329707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.329780 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.329764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.329832 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.329817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/75ff8df7-344f-4b0a-9028-97acaa4f0668-hub\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.335141 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.335121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdsf7\" (UniqueName: \"kubernetes.io/projected/75ff8df7-344f-4b0a-9028-97acaa4f0668-kube-api-access-mdsf7\") pod \"cluster-proxy-proxy-agent-67b9d9f88b-65xff\" (UID: \"75ff8df7-344f-4b0a-9028-97acaa4f0668\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.362692 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.362670 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm"
Apr 16 16:54:14.462889 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.462865 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"
Apr 16 16:54:14.497686 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.497625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm"]
Apr 16 16:54:14.501077 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:54:14.501048 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1744e7b_b60f_4273_aa2c_23c7e1d973a3.slice/crio-0fb67593e6c4cb24f75a8abb05923f14613c0987ef3e9fc0a2fbb9daa485468c WatchSource:0}: Error finding container 0fb67593e6c4cb24f75a8abb05923f14613c0987ef3e9fc0a2fbb9daa485468c: Status 404 returned error can't find the container with id 0fb67593e6c4cb24f75a8abb05923f14613c0987ef3e9fc0a2fbb9daa485468c
Apr 16 16:54:14.581881 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.581856 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff"]
Apr 16 16:54:14.584097 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:54:14.584072 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ff8df7_344f_4b0a_9028_97acaa4f0668.slice/crio-526dc953a06d3501a6ffe3d4196030843cd4ae16981a069941022dea7c4c92bc WatchSource:0}: Error finding container 526dc953a06d3501a6ffe3d4196030843cd4ae16981a069941022dea7c4c92bc: Status 404 returned error can't find the container with id 526dc953a06d3501a6ffe3d4196030843cd4ae16981a069941022dea7c4c92bc
Apr 16 16:54:14.622709 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.622680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" event={"ID":"75ff8df7-344f-4b0a-9028-97acaa4f0668","Type":"ContainerStarted","Data":"526dc953a06d3501a6ffe3d4196030843cd4ae16981a069941022dea7c4c92bc"}
Apr 16 16:54:14.623592 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:14.623572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" event={"ID":"a1744e7b-b60f-4273-aa2c-23c7e1d973a3","Type":"ContainerStarted","Data":"0fb67593e6c4cb24f75a8abb05923f14613c0987ef3e9fc0a2fbb9daa485468c"}
Apr 16 16:54:17.635194 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:17.634845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" event={"ID":"75ff8df7-344f-4b0a-9028-97acaa4f0668","Type":"ContainerStarted","Data":"a9cd6a60d646a7e3d17772480bcbb1a3b4cd3068b7acc39970ac846dcff2ff0a"}
Apr 16 16:54:17.636527 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:17.636495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" event={"ID":"a1744e7b-b60f-4273-aa2c-23c7e1d973a3","Type":"ContainerStarted","Data":"72e448a12d9bdc1e201b4741d29c3476385e739ea20d877de8b9b9f05df658e1"}
Apr 16 16:54:17.653735 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:17.653690 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-58dc4f8d67-h8pgm" podStartSLOduration=0.623942958 podStartE2EDuration="3.653676717s" podCreationTimestamp="2026-04-16 16:54:14 +0000 UTC" firstStartedPulling="2026-04-16 16:54:14.502703132 +0000 UTC m=+411.600582186" lastFinishedPulling="2026-04-16 16:54:17.532436891 +0000 UTC m=+414.630315945" observedRunningTime="2026-04-16 16:54:17.652328708 +0000 UTC m=+414.750207782" watchObservedRunningTime="2026-04-16 16:54:17.653676717 +0000 UTC m=+414.751555791"
Apr 16 16:54:19.643606 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:19.643574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" event={"ID":"75ff8df7-344f-4b0a-9028-97acaa4f0668","Type":"ContainerStarted","Data":"c5aba2a74d62c260d85163b588d9ab2ebdeb87c3e2c3398f4f6b91a0c6063d34"}
Apr 16 16:54:19.643606 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:19.643606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" event={"ID":"75ff8df7-344f-4b0a-9028-97acaa4f0668","Type":"ContainerStarted","Data":"e147d3cfc89ec2c08a947ddb4db36b507fedaa1c8d22a6e675da236c578cbc48"}
Apr 16 16:54:19.660254 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:54:19.660211 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67b9d9f88b-65xff" podStartSLOduration=0.827962351 podStartE2EDuration="5.660197423s" podCreationTimestamp="2026-04-16 16:54:14 +0000 UTC" firstStartedPulling="2026-04-16 16:54:14.585925026 +0000 UTC m=+411.683804083" lastFinishedPulling="2026-04-16 16:54:19.418160098 +0000 UTC m=+416.516039155" observedRunningTime="2026-04-16 16:54:19.659870226 +0000 UTC m=+416.757749323" watchObservedRunningTime="2026-04-16 16:54:19.660197423 +0000 UTC m=+416.758076497"
Apr 16 16:55:55.532798 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.532768 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-vzt88"]
Apr 16 16:55:55.535723 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.535704 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.538539 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.538516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-tvxrm\""
Apr 16 16:55:55.538799 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.538782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 16:55:55.539660 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.539644 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 16:55:55.539740 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.539668 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 16:55:55.544094 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.544073 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-vzt88"]
Apr 16 16:55:55.613417 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.613386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4kfj\" (UniqueName: \"kubernetes.io/projected/115ba5ea-a060-4e01-8199-61d6751e7468-kube-api-access-z4kfj\") pod \"seaweedfs-86cc847c5c-vzt88\" (UID: \"115ba5ea-a060-4e01-8199-61d6751e7468\") " pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.613513 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.613448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/115ba5ea-a060-4e01-8199-61d6751e7468-data\") pod \"seaweedfs-86cc847c5c-vzt88\" (UID: \"115ba5ea-a060-4e01-8199-61d6751e7468\") " pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.714468 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.714445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/115ba5ea-a060-4e01-8199-61d6751e7468-data\") pod \"seaweedfs-86cc847c5c-vzt88\" (UID: \"115ba5ea-a060-4e01-8199-61d6751e7468\") " pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.714548 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.714497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4kfj\" (UniqueName: \"kubernetes.io/projected/115ba5ea-a060-4e01-8199-61d6751e7468-kube-api-access-z4kfj\") pod \"seaweedfs-86cc847c5c-vzt88\" (UID: \"115ba5ea-a060-4e01-8199-61d6751e7468\") " pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.714787 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.714771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/115ba5ea-a060-4e01-8199-61d6751e7468-data\") pod \"seaweedfs-86cc847c5c-vzt88\" (UID: \"115ba5ea-a060-4e01-8199-61d6751e7468\") " pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.722412 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.722388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4kfj\" (UniqueName: \"kubernetes.io/projected/115ba5ea-a060-4e01-8199-61d6751e7468-kube-api-access-z4kfj\") pod \"seaweedfs-86cc847c5c-vzt88\" (UID: \"115ba5ea-a060-4e01-8199-61d6751e7468\") " pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.845321 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.845262 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:55.963438 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:55.963413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-vzt88"]
Apr 16 16:55:55.965670 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:55:55.965639 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115ba5ea_a060_4e01_8199_61d6751e7468.slice/crio-0f464215f9d20b21d22f0ef8555192946be815568e56d3bf0b045046d4ada9e1 WatchSource:0}: Error finding container 0f464215f9d20b21d22f0ef8555192946be815568e56d3bf0b045046d4ada9e1: Status 404 returned error can't find the container with id 0f464215f9d20b21d22f0ef8555192946be815568e56d3bf0b045046d4ada9e1
Apr 16 16:55:56.896822 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:56.896779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-vzt88" event={"ID":"115ba5ea-a060-4e01-8199-61d6751e7468","Type":"ContainerStarted","Data":"0f464215f9d20b21d22f0ef8555192946be815568e56d3bf0b045046d4ada9e1"}
Apr 16 16:55:58.904142 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:58.904110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-vzt88" event={"ID":"115ba5ea-a060-4e01-8199-61d6751e7468","Type":"ContainerStarted","Data":"6f2431a73079dd92a80007c3deea530b1097aadb49d12b571c01cdaf1ed013fd"}
Apr 16 16:55:58.904494 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:58.904169 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:55:58.920625 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:55:58.920581 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-vzt88" podStartSLOduration=1.4567278799999999 podStartE2EDuration="3.920568802s" podCreationTimestamp="2026-04-16 16:55:55 +0000 UTC" firstStartedPulling="2026-04-16 16:55:55.966918432 +0000 UTC m=+513.064797486" lastFinishedPulling="2026-04-16 16:55:58.430759351 +0000 UTC m=+515.528638408" observedRunningTime="2026-04-16 16:55:58.918884833 +0000 UTC m=+516.016763908" watchObservedRunningTime="2026-04-16 16:55:58.920568802 +0000 UTC m=+516.018447879"
Apr 16 16:56:04.908743 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:56:04.908675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-vzt88"
Apr 16 16:57:04.224537 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.224508 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-2kchp"]
Apr 16 16:57:04.227371 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.227356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.230182 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.230163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 16 16:57:04.230317 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.230228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-pqzvv\""
Apr 16 16:57:04.236764 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.236738 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2kchp"]
Apr 16 16:57:04.377898 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.377874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7532813-1d19-4981-890a-56b49e813729-cert\") pod \"odh-model-controller-696fc77849-2kchp\" (UID: \"f7532813-1d19-4981-890a-56b49e813729\") " pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.378063 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.377918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrcm4\" (UniqueName: \"kubernetes.io/projected/f7532813-1d19-4981-890a-56b49e813729-kube-api-access-lrcm4\") pod \"odh-model-controller-696fc77849-2kchp\" (UID: \"f7532813-1d19-4981-890a-56b49e813729\") " pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.478596 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.478526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7532813-1d19-4981-890a-56b49e813729-cert\") pod \"odh-model-controller-696fc77849-2kchp\" (UID: \"f7532813-1d19-4981-890a-56b49e813729\") " pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.478596 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.478566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrcm4\" (UniqueName: \"kubernetes.io/projected/f7532813-1d19-4981-890a-56b49e813729-kube-api-access-lrcm4\") pod \"odh-model-controller-696fc77849-2kchp\" (UID: \"f7532813-1d19-4981-890a-56b49e813729\") " pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.481357 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.481332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7532813-1d19-4981-890a-56b49e813729-cert\") pod \"odh-model-controller-696fc77849-2kchp\" (UID: \"f7532813-1d19-4981-890a-56b49e813729\") " pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.486427 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.486404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrcm4\" (UniqueName: \"kubernetes.io/projected/f7532813-1d19-4981-890a-56b49e813729-kube-api-access-lrcm4\") pod \"odh-model-controller-696fc77849-2kchp\" (UID: \"f7532813-1d19-4981-890a-56b49e813729\") " pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.538333 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.538310 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:04.660207 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:04.660176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2kchp"]
Apr 16 16:57:04.662457 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:57:04.662427 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7532813_1d19_4981_890a_56b49e813729.slice/crio-e71b3698c4be16c56c6c47a109ed48f4d6effe9b6e4fcab236c777a331b5c5ff WatchSource:0}: Error finding container e71b3698c4be16c56c6c47a109ed48f4d6effe9b6e4fcab236c777a331b5c5ff: Status 404 returned error can't find the container with id e71b3698c4be16c56c6c47a109ed48f4d6effe9b6e4fcab236c777a331b5c5ff
Apr 16 16:57:05.082039 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:05.082004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2kchp" event={"ID":"f7532813-1d19-4981-890a-56b49e813729","Type":"ContainerStarted","Data":"e71b3698c4be16c56c6c47a109ed48f4d6effe9b6e4fcab236c777a331b5c5ff"}
Apr 16 16:57:08.092797 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.092758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2kchp" event={"ID":"f7532813-1d19-4981-890a-56b49e813729","Type":"ContainerStarted","Data":"fa8d2c0b4eae2d41324a431867464ee47a23ee7880e9466e489fe242a3cfdefa"}
Apr 16 16:57:08.093217 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.092908 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:08.112574 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.112522 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-2kchp" podStartSLOduration=1.653278975 podStartE2EDuration="4.112504303s" podCreationTimestamp="2026-04-16 16:57:04 +0000 UTC" firstStartedPulling="2026-04-16 16:57:04.663733733 +0000 UTC m=+581.761612801" lastFinishedPulling="2026-04-16 16:57:07.122959073 +0000 UTC m=+584.220838129" observedRunningTime="2026-04-16 16:57:08.111125299 +0000 UTC m=+585.209004376" watchObservedRunningTime="2026-04-16 16:57:08.112504303 +0000 UTC m=+585.210383379"
Apr 16 16:57:08.507542 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.507507 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b9c4cb5d-8h8t8"]
Apr 16 16:57:08.510833 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.510815 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.522612 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.522574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b9c4cb5d-8h8t8"]
Apr 16 16:57:08.612544 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-config\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.612735 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-oauth-config\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.612735 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-service-ca\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.612735 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-serving-cert\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.612735 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-trusted-ca-bundle\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.612930 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-oauth-serving-cert\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.612930 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.612780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p8hp\" (UniqueName: \"kubernetes.io/projected/d409e915-8e1f-44de-ad55-ad3b405d0ae9-kube-api-access-9p8hp\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713585 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-config\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713585 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-oauth-config\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713808 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-service-ca\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713808 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-serving-cert\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713808 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-trusted-ca-bundle\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713808 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-oauth-serving-cert\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.713808 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.713719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p8hp\" (UniqueName: \"kubernetes.io/projected/d409e915-8e1f-44de-ad55-ad3b405d0ae9-kube-api-access-9p8hp\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.714436 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.714406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-config\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.714605 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.714508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-oauth-serving-cert\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.714605 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.714598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-service-ca\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.714806 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.714787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d409e915-8e1f-44de-ad55-ad3b405d0ae9-trusted-ca-bundle\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.717116 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.717091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-oauth-config\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.717215 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.717157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d409e915-8e1f-44de-ad55-ad3b405d0ae9-console-serving-cert\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.722198 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.722176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p8hp\" (UniqueName: \"kubernetes.io/projected/d409e915-8e1f-44de-ad55-ad3b405d0ae9-kube-api-access-9p8hp\") pod \"console-5b9c4cb5d-8h8t8\" (UID: \"d409e915-8e1f-44de-ad55-ad3b405d0ae9\") " pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.821563 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.821495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:08.946003 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:08.945923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b9c4cb5d-8h8t8"]
Apr 16 16:57:08.948262 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:57:08.948230 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd409e915_8e1f_44de_ad55_ad3b405d0ae9.slice/crio-3582b1f7f57528a63d40df54721808f29051dbf8425277b94b300f72289eabe5 WatchSource:0}: Error finding container 3582b1f7f57528a63d40df54721808f29051dbf8425277b94b300f72289eabe5: Status 404 returned error can't find the container with id 3582b1f7f57528a63d40df54721808f29051dbf8425277b94b300f72289eabe5
Apr 16 16:57:09.097171 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:09.097074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b9c4cb5d-8h8t8" event={"ID":"d409e915-8e1f-44de-ad55-ad3b405d0ae9","Type":"ContainerStarted","Data":"8af0bdc3832c4ab52e03206a9fc19e7b618b7ed9d84f0ca73a4bae78a565f81e"}
Apr 16 16:57:09.097171 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:09.097127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b9c4cb5d-8h8t8" event={"ID":"d409e915-8e1f-44de-ad55-ad3b405d0ae9","Type":"ContainerStarted","Data":"3582b1f7f57528a63d40df54721808f29051dbf8425277b94b300f72289eabe5"}
Apr 16 16:57:09.114570 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:09.114521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b9c4cb5d-8h8t8" podStartSLOduration=1.114508313 podStartE2EDuration="1.114508313s" podCreationTimestamp="2026-04-16 16:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:57:09.11274896 +0000 UTC m=+586.210628037" watchObservedRunningTime="2026-04-16 16:57:09.114508313 +0000 UTC m=+586.212387390"
Apr 16 16:57:18.822006 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:18.821970 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:18.822006 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:18.822016 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:18.826592 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:18.826569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:19.099531 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:19.099463 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-2kchp"
Apr 16 16:57:19.133164 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:19.133138 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b9c4cb5d-8h8t8"
Apr 16 16:57:19.183466 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:19.183434 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-749865d7b4-rc7tq"]
Apr 16 16:57:23.444431 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:23.444404 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log"
Apr 16 16:57:23.444840 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:23.444536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log"
Apr 16 16:57:39.418800 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.418759 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"]
Apr 16 16:57:39.422741 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.422721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"
Apr 16 16:57:39.425227 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.425209 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bgqcs\""
Apr 16 16:57:39.428305 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.428286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"]
Apr 16 16:57:39.435888 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.435866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3bcf0d4-3673-4f6b-a638-04809c5e5256-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw\" (UID: \"e3bcf0d4-3673-4f6b-a638-04809c5e5256\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"
Apr 16 16:57:39.536428 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.536398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3bcf0d4-3673-4f6b-a638-04809c5e5256-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw\" (UID: \"e3bcf0d4-3673-4f6b-a638-04809c5e5256\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"
Apr 16 16:57:39.536780 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.536761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3bcf0d4-3673-4f6b-a638-04809c5e5256-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw\" (UID: \"e3bcf0d4-3673-4f6b-a638-04809c5e5256\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"
Apr 16 16:57:39.735004 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.734976 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"
Apr 16 16:57:39.860788 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:39.860764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw"]
Apr 16 16:57:39.863343 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:57:39.863313 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bcf0d4_3673_4f6b_a638_04809c5e5256.slice/crio-1da4787801311c2ce30f727983698f2f907f88933e0be4cdea305f381455d202 WatchSource:0}: Error finding container 1da4787801311c2ce30f727983698f2f907f88933e0be4cdea305f381455d202: Status 404 returned error can't find the container with id 1da4787801311c2ce30f727983698f2f907f88933e0be4cdea305f381455d202
Apr 16 16:57:40.198930 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:40.198847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" event={"ID":"e3bcf0d4-3673-4f6b-a638-04809c5e5256","Type":"ContainerStarted","Data":"1da4787801311c2ce30f727983698f2f907f88933e0be4cdea305f381455d202"}
Apr 16 16:57:44.204966 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.204897 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-749865d7b4-rc7tq" podUID="6e382386-4def-4fa5-a98c-75bef20ebf0b" containerName="console" containerID="cri-o://e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13" gracePeriod=15
Apr 16 16:57:44.212740 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.212708 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" event={"ID":"e3bcf0d4-3673-4f6b-a638-04809c5e5256","Type":"ContainerStarted","Data":"3009ebc4842e6cb42414efd3e222f1944b7fb0fa458f1c5cb868facdb47eb86a"}
Apr 16 16:57:44.445695 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.445671 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-749865d7b4-rc7tq_6e382386-4def-4fa5-a98c-75bef20ebf0b/console/0.log"
Apr 16 16:57:44.445803 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.445734 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-749865d7b4-rc7tq"
Apr 16 16:57:44.479457 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479381 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-config\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") "
Apr 16 16:57:44.479589 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479455 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-trusted-ca-bundle\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") "
Apr 16 16:57:44.479589 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479485 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-service-ca\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID:
\"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " Apr 16 16:57:44.479589 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479517 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-oauth-serving-cert\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " Apr 16 16:57:44.479747 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479597 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-serving-cert\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " Apr 16 16:57:44.479747 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479636 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22j4h\" (UniqueName: \"kubernetes.io/projected/6e382386-4def-4fa5-a98c-75bef20ebf0b-kube-api-access-22j4h\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " Apr 16 16:57:44.479747 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479669 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-oauth-config\") pod \"6e382386-4def-4fa5-a98c-75bef20ebf0b\" (UID: \"6e382386-4def-4fa5-a98c-75bef20ebf0b\") " Apr 16 16:57:44.479897 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-config" (OuterVolumeSpecName: "console-config") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:57:44.479897 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479872 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-service-ca" (OuterVolumeSpecName: "service-ca") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:57:44.479897 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479880 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:57:44.480076 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479902 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:57:44.480076 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.479915 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:44.481830 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.481804 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:57:44.482017 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.481988 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:57:44.482176 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.482153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e382386-4def-4fa5-a98c-75bef20ebf0b-kube-api-access-22j4h" (OuterVolumeSpecName: "kube-api-access-22j4h") pod "6e382386-4def-4fa5-a98c-75bef20ebf0b" (UID: "6e382386-4def-4fa5-a98c-75bef20ebf0b"). InnerVolumeSpecName "kube-api-access-22j4h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:57:44.580564 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.580538 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-trusted-ca-bundle\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:44.580564 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.580561 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-service-ca\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:44.580727 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.580570 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e382386-4def-4fa5-a98c-75bef20ebf0b-oauth-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:44.580727 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.580580 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-serving-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:44.580727 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.580589 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22j4h\" (UniqueName: \"kubernetes.io/projected/6e382386-4def-4fa5-a98c-75bef20ebf0b-kube-api-access-22j4h\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:44.580727 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:44.580598 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e382386-4def-4fa5-a98c-75bef20ebf0b-console-oauth-config\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 16:57:45.217229 ip-10-0-130-1 kubenswrapper[2576]: 
I0416 16:57:45.217205 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-749865d7b4-rc7tq_6e382386-4def-4fa5-a98c-75bef20ebf0b/console/0.log" Apr 16 16:57:45.217623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.217248 2576 generic.go:358] "Generic (PLEG): container finished" podID="6e382386-4def-4fa5-a98c-75bef20ebf0b" containerID="e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13" exitCode=2 Apr 16 16:57:45.217623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.217318 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-749865d7b4-rc7tq" Apr 16 16:57:45.217623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.217332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-749865d7b4-rc7tq" event={"ID":"6e382386-4def-4fa5-a98c-75bef20ebf0b","Type":"ContainerDied","Data":"e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13"} Apr 16 16:57:45.217623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.217371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-749865d7b4-rc7tq" event={"ID":"6e382386-4def-4fa5-a98c-75bef20ebf0b","Type":"ContainerDied","Data":"0df10e9bc0ca0179481c281a9fb9e15110008607ba6d2a27d19eef3f7cb8def2"} Apr 16 16:57:45.217623 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.217386 2576 scope.go:117] "RemoveContainer" containerID="e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13" Apr 16 16:57:45.226047 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.226031 2576 scope.go:117] "RemoveContainer" containerID="e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13" Apr 16 16:57:45.226282 ip-10-0-130-1 kubenswrapper[2576]: E0416 16:57:45.226264 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13\": 
container with ID starting with e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13 not found: ID does not exist" containerID="e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13" Apr 16 16:57:45.226356 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.226288 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13"} err="failed to get container status \"e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13\": rpc error: code = NotFound desc = could not find container \"e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13\": container with ID starting with e3e8da67d06193e8e66476e9cb0db4a3a00e646fc059c383558b272ca954ec13 not found: ID does not exist" Apr 16 16:57:45.239645 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.239613 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-749865d7b4-rc7tq"] Apr 16 16:57:45.243507 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.243485 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-749865d7b4-rc7tq"] Apr 16 16:57:45.509520 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:45.509487 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e382386-4def-4fa5-a98c-75bef20ebf0b" path="/var/lib/kubelet/pods/6e382386-4def-4fa5-a98c-75bef20ebf0b/volumes" Apr 16 16:57:47.225135 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:47.225045 2576 generic.go:358] "Generic (PLEG): container finished" podID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerID="3009ebc4842e6cb42414efd3e222f1944b7fb0fa458f1c5cb868facdb47eb86a" exitCode=0 Apr 16 16:57:47.225135 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:57:47.225124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" 
event={"ID":"e3bcf0d4-3673-4f6b-a638-04809c5e5256","Type":"ContainerDied","Data":"3009ebc4842e6cb42414efd3e222f1944b7fb0fa458f1c5cb868facdb47eb86a"} Apr 16 16:58:01.280648 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:01.280612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" event={"ID":"e3bcf0d4-3673-4f6b-a638-04809c5e5256","Type":"ContainerStarted","Data":"5b70dd517c1dd5bf49583ed46e7297cd3189da6d6d4dc984e390ce725bbe52a7"} Apr 16 16:58:03.289301 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:03.289265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" event={"ID":"e3bcf0d4-3673-4f6b-a638-04809c5e5256","Type":"ContainerStarted","Data":"646e519db8777fad14abd7ae4b2ea784a010d9c6ac076f12f586aaba387a2626"} Apr 16 16:58:03.289680 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:03.289631 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" Apr 16 16:58:03.289680 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:03.289655 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" Apr 16 16:58:03.291191 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:03.291139 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:03.291824 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:03.291802 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" 
podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:58:03.306611 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:03.306565 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podStartSLOduration=1.805516904 podStartE2EDuration="24.306553789s" podCreationTimestamp="2026-04-16 16:57:39 +0000 UTC" firstStartedPulling="2026-04-16 16:57:39.865617036 +0000 UTC m=+616.963496093" lastFinishedPulling="2026-04-16 16:58:02.366653918 +0000 UTC m=+639.464532978" observedRunningTime="2026-04-16 16:58:03.305382895 +0000 UTC m=+640.403262048" watchObservedRunningTime="2026-04-16 16:58:03.306553789 +0000 UTC m=+640.404432865" Apr 16 16:58:04.292109 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:04.292071 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:04.292482 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:04.292378 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:58:14.292014 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:14.291916 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:14.292648 ip-10-0-130-1 
kubenswrapper[2576]: I0416 16:58:14.292306 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:58:24.292495 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:24.292449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:24.293279 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:24.292993 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:58:34.292751 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:34.292708 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:34.293227 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:34.293194 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:58:44.292785 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:44.292741 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:44.293269 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:44.293116 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:58:54.292349 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:54.292269 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 16:58:54.292965 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:58:54.292679 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" podUID="e3bcf0d4-3673-4f6b-a638-04809c5e5256" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:59:04.293222 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:04.293132 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" Apr 16 16:59:04.293731 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:04.293576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw" Apr 16 16:59:09.767566 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.767536 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"] Apr 16 16:59:09.768034 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.767852 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e382386-4def-4fa5-a98c-75bef20ebf0b" containerName="console" Apr 16 16:59:09.768034 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.767862 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e382386-4def-4fa5-a98c-75bef20ebf0b" containerName="console" Apr 16 16:59:09.768034 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.767925 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e382386-4def-4fa5-a98c-75bef20ebf0b" containerName="console" Apr 16 16:59:09.771036 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.771020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" Apr 16 16:59:09.778008 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.777643 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"] Apr 16 16:59:09.809631 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.809596 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"] Apr 16 16:59:09.812841 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.812824 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" Apr 16 16:59:09.820514 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.820493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"] Apr 16 16:59:09.829193 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.829171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73978eeb-823c-408c-9097-74093ed71133-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc\" (UID: \"73978eeb-823c-408c-9097-74093ed71133\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" Apr 16 16:59:09.929725 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.929700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/471d0f31-b565-488b-bb21-1a41075488e3-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4\" (UID: \"471d0f31-b565-488b-bb21-1a41075488e3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" Apr 16 16:59:09.929816 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.929784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73978eeb-823c-408c-9097-74093ed71133-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc\" (UID: \"73978eeb-823c-408c-9097-74093ed71133\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" Apr 16 16:59:09.930112 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:09.930095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73978eeb-823c-408c-9097-74093ed71133-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc\" (UID: \"73978eeb-823c-408c-9097-74093ed71133\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" Apr 16 16:59:10.030617 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.030560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/471d0f31-b565-488b-bb21-1a41075488e3-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4\" (UID: \"471d0f31-b565-488b-bb21-1a41075488e3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" Apr 16 16:59:10.030884 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.030867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/471d0f31-b565-488b-bb21-1a41075488e3-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4\" (UID: \"471d0f31-b565-488b-bb21-1a41075488e3\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" Apr 16 16:59:10.081628 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.081608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" Apr 16 16:59:10.122536 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.122513 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" Apr 16 16:59:10.229847 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.229822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"] Apr 16 16:59:10.233188 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:59:10.233163 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73978eeb_823c_408c_9097_74093ed71133.slice/crio-0900f11418eddbf0b74bb72a037429aaa4a878c5b38071acfe72d6522af5544a WatchSource:0}: Error finding container 0900f11418eddbf0b74bb72a037429aaa4a878c5b38071acfe72d6522af5544a: Status 404 returned error can't find the container with id 0900f11418eddbf0b74bb72a037429aaa4a878c5b38071acfe72d6522af5544a Apr 16 16:59:10.235346 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.235330 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:59:10.274153 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.274130 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"] Apr 16 16:59:10.277023 ip-10-0-130-1 kubenswrapper[2576]: W0416 16:59:10.276987 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod471d0f31_b565_488b_bb21_1a41075488e3.slice/crio-90acd9155bd7295ce4f213aa2f90566b4cf8412d2f51e074d405ab255d80b573 WatchSource:0}: Error finding container 90acd9155bd7295ce4f213aa2f90566b4cf8412d2f51e074d405ab255d80b573: Status 404 returned error can't find the container with id 90acd9155bd7295ce4f213aa2f90566b4cf8412d2f51e074d405ab255d80b573 Apr 16 16:59:10.486016 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.485976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" event={"ID":"73978eeb-823c-408c-9097-74093ed71133","Type":"ContainerStarted","Data":"b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14"} Apr 16 16:59:10.486231 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.486028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" event={"ID":"73978eeb-823c-408c-9097-74093ed71133","Type":"ContainerStarted","Data":"0900f11418eddbf0b74bb72a037429aaa4a878c5b38071acfe72d6522af5544a"} Apr 16 16:59:10.487495 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.487470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" event={"ID":"471d0f31-b565-488b-bb21-1a41075488e3","Type":"ContainerStarted","Data":"3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c"} Apr 16 16:59:10.487495 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:10.487497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" event={"ID":"471d0f31-b565-488b-bb21-1a41075488e3","Type":"ContainerStarted","Data":"90acd9155bd7295ce4f213aa2f90566b4cf8412d2f51e074d405ab255d80b573"} Apr 16 16:59:14.502684 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:14.502651 2576 generic.go:358] "Generic (PLEG): container finished" podID="73978eeb-823c-408c-9097-74093ed71133" containerID="b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14" exitCode=0 Apr 16 16:59:14.503045 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:14.502727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" event={"ID":"73978eeb-823c-408c-9097-74093ed71133","Type":"ContainerDied","Data":"b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14"} Apr 16 16:59:14.504275 
ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:14.504249 2576 generic.go:358] "Generic (PLEG): container finished" podID="471d0f31-b565-488b-bb21-1a41075488e3" containerID="3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c" exitCode=0
Apr 16 16:59:14.504363 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:14.504294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" event={"ID":"471d0f31-b565-488b-bb21-1a41075488e3","Type":"ContainerDied","Data":"3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c"}
Apr 16 16:59:15.511783 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:15.511742 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" event={"ID":"73978eeb-823c-408c-9097-74093ed71133","Type":"ContainerStarted","Data":"102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6"}
Apr 16 16:59:15.512217 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:15.512203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"
Apr 16 16:59:15.513696 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:15.513451 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 16:59:15.530099 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:15.530012 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podStartSLOduration=6.529996785 podStartE2EDuration="6.529996785s" podCreationTimestamp="2026-04-16 16:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:59:15.527438034 +0000 UTC m=+712.625317112" watchObservedRunningTime="2026-04-16 16:59:15.529996785 +0000 UTC m=+712.627875861"
Apr 16 16:59:16.516621 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:16.516578 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 16:59:26.517424 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:26.517376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 16:59:31.571117 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:31.571080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" event={"ID":"471d0f31-b565-488b-bb21-1a41075488e3","Type":"ContainerStarted","Data":"b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1"}
Apr 16 16:59:31.571568 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:31.571367 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"
Apr 16 16:59:31.572875 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:31.572845 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 16:59:31.589618 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:31.589568 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podStartSLOduration=6.375117678 podStartE2EDuration="22.589557001s" podCreationTimestamp="2026-04-16 16:59:09 +0000 UTC" firstStartedPulling="2026-04-16 16:59:14.505794421 +0000 UTC m=+711.603673475" lastFinishedPulling="2026-04-16 16:59:30.720233744 +0000 UTC m=+727.818112798" observedRunningTime="2026-04-16 16:59:31.586873001 +0000 UTC m=+728.684752077" watchObservedRunningTime="2026-04-16 16:59:31.589557001 +0000 UTC m=+728.687436133"
Apr 16 16:59:32.574783 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:32.574743 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 16:59:36.516607 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:36.516564 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 16:59:42.575312 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:42.575276 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 16:59:46.517282 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:46.517246 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 16:59:52.575597 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:52.575551 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 16:59:56.517048 ip-10-0-130-1 kubenswrapper[2576]: I0416 16:59:56.517007 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 17:00:02.575116 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:02.575079 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 17:00:06.516681 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:06.516641 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 17:00:12.575307 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:12.575268 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 17:00:16.517123 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:16.517069 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 17:00:22.575542 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:22.575494 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 17:00:26.517834 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:26.517804 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"
Apr 16 17:00:32.576041 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:32.575936 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"
Apr 16 17:00:49.985161 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:49.985127 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"]
Apr 16 17:00:49.985725 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:49.985398 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" containerID="cri-o://102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6" gracePeriod=30
Apr 16 17:00:50.032597 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.032567 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"]
Apr 16 17:00:50.036109 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.036087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"
Apr 16 17:00:50.045407 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.045387 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"]
Apr 16 17:00:50.079503 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.079476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884f570a-0fa7-4098-b34f-91043f2bfce3-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4\" (UID: \"884f570a-0fa7-4098-b34f-91043f2bfce3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"
Apr 16 17:00:50.096700 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.096677 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"]
Apr 16 17:00:50.096937 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.096917 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" containerID="cri-o://b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1" gracePeriod=30
Apr 16 17:00:50.114353 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.114332 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"]
Apr 16 17:00:50.117560 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.117542 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"
Apr 16 17:00:50.123045 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.123022 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"]
Apr 16 17:00:50.180134 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.180100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884f570a-0fa7-4098-b34f-91043f2bfce3-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4\" (UID: \"884f570a-0fa7-4098-b34f-91043f2bfce3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"
Apr 16 17:00:50.180275 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.180151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21225193-7f3e-447f-b99c-0926020bbdd9-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx\" (UID: \"21225193-7f3e-447f-b99c-0926020bbdd9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"
Apr 16 17:00:50.180445 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.180425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884f570a-0fa7-4098-b34f-91043f2bfce3-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4\" (UID: \"884f570a-0fa7-4098-b34f-91043f2bfce3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"
Apr 16 17:00:50.281266 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.281190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21225193-7f3e-447f-b99c-0926020bbdd9-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx\" (UID: \"21225193-7f3e-447f-b99c-0926020bbdd9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"
Apr 16 17:00:50.281564 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.281544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21225193-7f3e-447f-b99c-0926020bbdd9-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx\" (UID: \"21225193-7f3e-447f-b99c-0926020bbdd9\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"
Apr 16 17:00:50.348860 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.348829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"
Apr 16 17:00:50.428261 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.428231 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"
Apr 16 17:00:50.483705 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.483581 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"]
Apr 16 17:00:50.488652 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:00:50.488589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884f570a_0fa7_4098_b34f_91043f2bfce3.slice/crio-93ba15c324f7cf3c69c727c1c823b2570e4776d2d74dc6aab4940ba81d51c173 WatchSource:0}: Error finding container 93ba15c324f7cf3c69c727c1c823b2570e4776d2d74dc6aab4940ba81d51c173: Status 404 returned error can't find the container with id 93ba15c324f7cf3c69c727c1c823b2570e4776d2d74dc6aab4940ba81d51c173
Apr 16 17:00:50.561787 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.561760 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"]
Apr 16 17:00:50.564062 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:00:50.564037 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21225193_7f3e_447f_b99c_0926020bbdd9.slice/crio-43049eb10ae380b005aa78a48167942a38fcec053fc0541dfc819581ff3df630 WatchSource:0}: Error finding container 43049eb10ae380b005aa78a48167942a38fcec053fc0541dfc819581ff3df630: Status 404 returned error can't find the container with id 43049eb10ae380b005aa78a48167942a38fcec053fc0541dfc819581ff3df630
Apr 16 17:00:50.827596 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.827503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" event={"ID":"21225193-7f3e-447f-b99c-0926020bbdd9","Type":"ContainerStarted","Data":"927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859"}
Apr 16 17:00:50.827596 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.827540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" event={"ID":"21225193-7f3e-447f-b99c-0926020bbdd9","Type":"ContainerStarted","Data":"43049eb10ae380b005aa78a48167942a38fcec053fc0541dfc819581ff3df630"}
Apr 16 17:00:50.829046 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.829018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" event={"ID":"884f570a-0fa7-4098-b34f-91043f2bfce3","Type":"ContainerStarted","Data":"2076bea8182847ee8402f5ca990d91b690e5246cd160b9e9e82e808fa5ffbfea"}
Apr 16 17:00:50.829046 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:50.829049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" event={"ID":"884f570a-0fa7-4098-b34f-91043f2bfce3","Type":"ContainerStarted","Data":"93ba15c324f7cf3c69c727c1c823b2570e4776d2d74dc6aab4940ba81d51c173"}
Apr 16 17:00:52.574878 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:52.574841 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 17:00:53.547932 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.547911 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"
Apr 16 17:00:53.612999 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.612970 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/471d0f31-b565-488b-bb21-1a41075488e3-kserve-provision-location\") pod \"471d0f31-b565-488b-bb21-1a41075488e3\" (UID: \"471d0f31-b565-488b-bb21-1a41075488e3\") "
Apr 16 17:00:53.613424 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.613295 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/471d0f31-b565-488b-bb21-1a41075488e3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "471d0f31-b565-488b-bb21-1a41075488e3" (UID: "471d0f31-b565-488b-bb21-1a41075488e3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:00:53.713633 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.713564 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/471d0f31-b565-488b-bb21-1a41075488e3-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\""
Apr 16 17:00:53.841715 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.841681 2576 generic.go:358] "Generic (PLEG): container finished" podID="471d0f31-b565-488b-bb21-1a41075488e3" containerID="b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1" exitCode=0
Apr 16 17:00:53.841853 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.841726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" event={"ID":"471d0f31-b565-488b-bb21-1a41075488e3","Type":"ContainerDied","Data":"b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1"}
Apr 16 17:00:53.841853 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.841754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4" event={"ID":"471d0f31-b565-488b-bb21-1a41075488e3","Type":"ContainerDied","Data":"90acd9155bd7295ce4f213aa2f90566b4cf8412d2f51e074d405ab255d80b573"}
Apr 16 17:00:53.841853 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.841755 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"
Apr 16 17:00:53.841853 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.841769 2576 scope.go:117] "RemoveContainer" containerID="b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1"
Apr 16 17:00:53.850382 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.850362 2576 scope.go:117] "RemoveContainer" containerID="3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c"
Apr 16 17:00:53.857624 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.857609 2576 scope.go:117] "RemoveContainer" containerID="b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1"
Apr 16 17:00:53.857842 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:00:53.857825 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1\": container with ID starting with b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1 not found: ID does not exist" containerID="b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1"
Apr 16 17:00:53.857883 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.857852 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1"} err="failed to get container status \"b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1\": rpc error: code = NotFound desc = could not find container \"b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1\": container with ID starting with b49e4f34210b44a32d7ce289ba526b8b1cd8599a9cb661a76205a921556feac1 not found: ID does not exist"
Apr 16 17:00:53.857883 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.857868 2576 scope.go:117] "RemoveContainer" containerID="3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c"
Apr 16 17:00:53.858078 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:00:53.858064 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c\": container with ID starting with 3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c not found: ID does not exist" containerID="3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c"
Apr 16 17:00:53.858127 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.858083 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c"} err="failed to get container status \"3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c\": rpc error: code = NotFound desc = could not find container \"3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c\": container with ID starting with 3efe59c179d90fe7b952b60432a24be04bf8d9fbd362f8347163ee030bd7fb0c not found: ID does not exist"
Apr 16 17:00:53.862041 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.862022 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"]
Apr 16 17:00:53.865333 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:53.865314 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-231d1-predictor-5f9bd8476c-r6jf4"]
Apr 16 17:00:54.139598 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.138660 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"
Apr 16 17:00:54.218497 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.218464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73978eeb-823c-408c-9097-74093ed71133-kserve-provision-location\") pod \"73978eeb-823c-408c-9097-74093ed71133\" (UID: \"73978eeb-823c-408c-9097-74093ed71133\") "
Apr 16 17:00:54.218779 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.218758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73978eeb-823c-408c-9097-74093ed71133-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73978eeb-823c-408c-9097-74093ed71133" (UID: "73978eeb-823c-408c-9097-74093ed71133"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:00:54.319682 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.319657 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73978eeb-823c-408c-9097-74093ed71133-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\""
Apr 16 17:00:54.846190 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.846161 2576 generic.go:358] "Generic (PLEG): container finished" podID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerID="2076bea8182847ee8402f5ca990d91b690e5246cd160b9e9e82e808fa5ffbfea" exitCode=0
Apr 16 17:00:54.846560 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.846236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" event={"ID":"884f570a-0fa7-4098-b34f-91043f2bfce3","Type":"ContainerDied","Data":"2076bea8182847ee8402f5ca990d91b690e5246cd160b9e9e82e808fa5ffbfea"}
Apr 16 17:00:54.848383 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.848357 2576 generic.go:358] "Generic (PLEG): container finished" podID="73978eeb-823c-408c-9097-74093ed71133" containerID="102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6" exitCode=0
Apr 16 17:00:54.848504 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.848412 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"
Apr 16 17:00:54.848504 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.848441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" event={"ID":"73978eeb-823c-408c-9097-74093ed71133","Type":"ContainerDied","Data":"102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6"}
Apr 16 17:00:54.848504 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.848472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc" event={"ID":"73978eeb-823c-408c-9097-74093ed71133","Type":"ContainerDied","Data":"0900f11418eddbf0b74bb72a037429aaa4a878c5b38071acfe72d6522af5544a"}
Apr 16 17:00:54.848504 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.848505 2576 scope.go:117] "RemoveContainer" containerID="102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6"
Apr 16 17:00:54.850010 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.849983 2576 generic.go:358] "Generic (PLEG): container finished" podID="21225193-7f3e-447f-b99c-0926020bbdd9" containerID="927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859" exitCode=0
Apr 16 17:00:54.850212 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.850051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" event={"ID":"21225193-7f3e-447f-b99c-0926020bbdd9","Type":"ContainerDied","Data":"927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859"}
Apr 16 17:00:54.859834 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.859809 2576 scope.go:117] "RemoveContainer" containerID="b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14"
Apr 16 17:00:54.874182 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.874160 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"]
Apr 16 17:00:54.877977 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.877958 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-231d1-predictor-5487fb9697-79khc"]
Apr 16 17:00:54.878590 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.878574 2576 scope.go:117] "RemoveContainer" containerID="102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6"
Apr 16 17:00:54.878880 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:00:54.878848 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6\": container with ID starting with 102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6 not found: ID does not exist" containerID="102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6"
Apr 16 17:00:54.878936 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.878874 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6"} err="failed to get container status \"102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6\": rpc error: code = NotFound desc = could not find container \"102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6\": container with ID starting with 102dd22e0bd41ef917a55641a691594ce932096a0aeefb58db9ae4b0f4133eb6 not found: ID does not exist"
Apr 16 17:00:54.878936 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.878892 2576 scope.go:117] "RemoveContainer" containerID="b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14"
Apr 16 17:00:54.879166 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:00:54.879149 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14\": container with ID starting with b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14 not found: ID does not exist" containerID="b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14"
Apr 16 17:00:54.879214 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:54.879173 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14"} err="failed to get container status \"b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14\": rpc error: code = NotFound desc = could not find container \"b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14\": container with ID starting with b7da4a5803884a794fffd894f88bebd7d7e3fcbb4bc70a7c37ab87d25619fd14 not found: ID does not exist"
Apr 16 17:00:55.511458 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.511421 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="471d0f31-b565-488b-bb21-1a41075488e3" path="/var/lib/kubelet/pods/471d0f31-b565-488b-bb21-1a41075488e3/volumes"
Apr 16 17:00:55.511879 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.511859 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73978eeb-823c-408c-9097-74093ed71133" path="/var/lib/kubelet/pods/73978eeb-823c-408c-9097-74093ed71133/volumes"
Apr 16 17:00:55.856008 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.855918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" event={"ID":"21225193-7f3e-447f-b99c-0926020bbdd9","Type":"ContainerStarted","Data":"f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891"}
Apr 16 17:00:55.856382 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.856284 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"
Apr 16 17:00:55.857693 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.857669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" event={"ID":"884f570a-0fa7-4098-b34f-91043f2bfce3","Type":"ContainerStarted","Data":"39e17800d419aea33666598a131997585d7e0304122b675e1d4c2cf5722a9dc6"}
Apr 16 17:00:55.857891 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.857867 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:00:55.857993 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.857973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"
Apr 16 17:00:55.858727 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.858705 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 17:00:55.873306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.873266 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podStartSLOduration=5.873256314 podStartE2EDuration="5.873256314s" podCreationTimestamp="2026-04-16 17:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:00:55.87149369 +0000 UTC m=+812.969372767" watchObservedRunningTime="2026-04-16 17:00:55.873256314 +0000 UTC m=+812.971135390"
Apr 16 17:00:55.887161 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:55.887114 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podStartSLOduration=5.887103594 podStartE2EDuration="5.887103594s" podCreationTimestamp="2026-04-16 17:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:00:55.885375958 +0000 UTC m=+812.983255065" watchObservedRunningTime="2026-04-16 17:00:55.887103594 +0000 UTC m=+812.984982670"
Apr 16 17:00:56.861445 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:56.861405 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 17:00:56.861817 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:00:56.861405 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:01:06.862074 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:06.862025 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:01:06.862536 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:06.862028 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 17:01:16.861753 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:16.861709 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:01:16.862199 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:16.861709 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 17:01:26.862142 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:26.862097 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 17:01:26.862519 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:26.862097 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:01:36.862103 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:36.862008 2576 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 17:01:36.862103 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:36.862008 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 17:01:46.861735 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:46.861697 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 17:01:46.862121 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:46.861705 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 17:01:56.861612 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:56.861569 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 17:01:56.862507 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:01:56.862490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" Apr 16 17:02:06.862205 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:06.862124 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" Apr 16 17:02:23.466828 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:23.466803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 17:02:23.467248 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:23.467036 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 17:02:30.268176 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:30.268142 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"] Apr 16 17:02:30.268756 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:30.268566 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" containerID="cri-o://39e17800d419aea33666598a131997585d7e0304122b675e1d4c2cf5722a9dc6" gracePeriod=30 Apr 16 17:02:30.346542 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:30.346504 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"] Apr 16 17:02:30.346826 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:30.346789 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" 
containerName="kserve-container" containerID="cri-o://f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891" gracePeriod=30 Apr 16 17:02:33.480129 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:33.480100 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" Apr 16 17:02:33.649002 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:33.648921 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21225193-7f3e-447f-b99c-0926020bbdd9-kserve-provision-location\") pod \"21225193-7f3e-447f-b99c-0926020bbdd9\" (UID: \"21225193-7f3e-447f-b99c-0926020bbdd9\") " Apr 16 17:02:33.649214 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:33.649192 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21225193-7f3e-447f-b99c-0926020bbdd9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21225193-7f3e-447f-b99c-0926020bbdd9" (UID: "21225193-7f3e-447f-b99c-0926020bbdd9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:33.750287 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:33.750263 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21225193-7f3e-447f-b99c-0926020bbdd9-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:02:34.198277 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.198246 2576 generic.go:358] "Generic (PLEG): container finished" podID="21225193-7f3e-447f-b99c-0926020bbdd9" containerID="f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891" exitCode=0 Apr 16 17:02:34.198476 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.198300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" event={"ID":"21225193-7f3e-447f-b99c-0926020bbdd9","Type":"ContainerDied","Data":"f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891"} Apr 16 17:02:34.198476 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.198320 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" Apr 16 17:02:34.198476 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.198338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx" event={"ID":"21225193-7f3e-447f-b99c-0926020bbdd9","Type":"ContainerDied","Data":"43049eb10ae380b005aa78a48167942a38fcec053fc0541dfc819581ff3df630"} Apr 16 17:02:34.198476 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.198359 2576 scope.go:117] "RemoveContainer" containerID="f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891" Apr 16 17:02:34.200426 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.200399 2576 generic.go:358] "Generic (PLEG): container finished" podID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerID="39e17800d419aea33666598a131997585d7e0304122b675e1d4c2cf5722a9dc6" exitCode=0 Apr 16 17:02:34.200552 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.200439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" event={"ID":"884f570a-0fa7-4098-b34f-91043f2bfce3","Type":"ContainerDied","Data":"39e17800d419aea33666598a131997585d7e0304122b675e1d4c2cf5722a9dc6"} Apr 16 17:02:34.207212 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.207192 2576 scope.go:117] "RemoveContainer" containerID="927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859" Apr 16 17:02:34.214882 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.214864 2576 scope.go:117] "RemoveContainer" containerID="f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891" Apr 16 17:02:34.215189 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:02:34.215170 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891\": container with 
ID starting with f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891 not found: ID does not exist" containerID="f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891" Apr 16 17:02:34.215278 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.215201 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891"} err="failed to get container status \"f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891\": rpc error: code = NotFound desc = could not find container \"f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891\": container with ID starting with f92b64fc945806c85dc93418778c076913f0ab0e47898c4d81825e8359291891 not found: ID does not exist" Apr 16 17:02:34.215278 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.215224 2576 scope.go:117] "RemoveContainer" containerID="927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859" Apr 16 17:02:34.215480 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:02:34.215463 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859\": container with ID starting with 927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859 not found: ID does not exist" containerID="927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859" Apr 16 17:02:34.215515 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.215488 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859"} err="failed to get container status \"927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859\": rpc error: code = NotFound desc = could not find container \"927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859\": container with ID starting with 
927de74b16c9052fabfced494b9ffce0d3e51f68741d002fa8b57419d186b859 not found: ID does not exist" Apr 16 17:02:34.220119 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.220096 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"] Apr 16 17:02:34.226343 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.226320 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4ab2c-predictor-68cc9b64ff-hcdmx"] Apr 16 17:02:34.307323 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.307300 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" Apr 16 17:02:34.454628 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.454531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884f570a-0fa7-4098-b34f-91043f2bfce3-kserve-provision-location\") pod \"884f570a-0fa7-4098-b34f-91043f2bfce3\" (UID: \"884f570a-0fa7-4098-b34f-91043f2bfce3\") " Apr 16 17:02:34.454851 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.454829 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884f570a-0fa7-4098-b34f-91043f2bfce3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "884f570a-0fa7-4098-b34f-91043f2bfce3" (UID: "884f570a-0fa7-4098-b34f-91043f2bfce3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:02:34.555836 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:34.555799 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/884f570a-0fa7-4098-b34f-91043f2bfce3-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:02:35.206649 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.206613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" event={"ID":"884f570a-0fa7-4098-b34f-91043f2bfce3","Type":"ContainerDied","Data":"93ba15c324f7cf3c69c727c1c823b2570e4776d2d74dc6aab4940ba81d51c173"} Apr 16 17:02:35.206819 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.206660 2576 scope.go:117] "RemoveContainer" containerID="39e17800d419aea33666598a131997585d7e0304122b675e1d4c2cf5722a9dc6" Apr 16 17:02:35.206819 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.206670 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4" Apr 16 17:02:35.215326 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.215299 2576 scope.go:117] "RemoveContainer" containerID="2076bea8182847ee8402f5ca990d91b690e5246cd160b9e9e82e808fa5ffbfea" Apr 16 17:02:35.227623 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.227604 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"] Apr 16 17:02:35.231469 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.231452 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-4ab2c-predictor-75d57fbf7c-blvm4"] Apr 16 17:02:35.508637 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.508606 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" path="/var/lib/kubelet/pods/21225193-7f3e-447f-b99c-0926020bbdd9/volumes" Apr 16 17:02:35.508973 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:35.508961 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" path="/var/lib/kubelet/pods/884f570a-0fa7-4098-b34f-91043f2bfce3/volumes" Apr 16 17:02:40.347766 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.347730 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj"] Apr 16 17:02:40.348206 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348190 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348209 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 
17:02:40.348221 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="storage-initializer" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348229 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="storage-initializer" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348243 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348251 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348261 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348269 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348280 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="storage-initializer" Apr 16 17:02:40.348306 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348288 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="storage-initializer" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348312 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="storage-initializer" Apr 16 17:02:40.348713 
ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348320 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="storage-initializer" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348331 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="storage-initializer" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348339 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="storage-initializer" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348348 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348356 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348434 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="21225193-7f3e-447f-b99c-0926020bbdd9" containerName="kserve-container" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348446 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="471d0f31-b565-488b-bb21-1a41075488e3" containerName="kserve-container" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348455 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="884f570a-0fa7-4098-b34f-91043f2bfce3" containerName="kserve-container" Apr 16 17:02:40.348713 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.348466 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="73978eeb-823c-408c-9097-74093ed71133" containerName="kserve-container" Apr 16 17:02:40.350512 
ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.350491 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:40.359749 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.359716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj"] Apr 16 17:02:40.499039 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.499017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2086a436-bded-4cab-b596-ed5ffdade66e-kserve-provision-location\") pod \"isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj\" (UID: \"2086a436-bded-4cab-b596-ed5ffdade66e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:40.600176 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.600093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2086a436-bded-4cab-b596-ed5ffdade66e-kserve-provision-location\") pod \"isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj\" (UID: \"2086a436-bded-4cab-b596-ed5ffdade66e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:40.600635 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.600610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2086a436-bded-4cab-b596-ed5ffdade66e-kserve-provision-location\") pod \"isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj\" (UID: \"2086a436-bded-4cab-b596-ed5ffdade66e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:40.661097 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.661078 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:40.780092 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:40.780044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj"] Apr 16 17:02:40.783396 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:02:40.783366 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2086a436_bded_4cab_b596_ed5ffdade66e.slice/crio-68b0628516008dc44be32c8312faf024b701ddb6c50535205043ef0547b89c26 WatchSource:0}: Error finding container 68b0628516008dc44be32c8312faf024b701ddb6c50535205043ef0547b89c26: Status 404 returned error can't find the container with id 68b0628516008dc44be32c8312faf024b701ddb6c50535205043ef0547b89c26 Apr 16 17:02:41.227655 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:41.227623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" event={"ID":"2086a436-bded-4cab-b596-ed5ffdade66e","Type":"ContainerStarted","Data":"ec22ce850ae3d01f1ac2a95f9982cfb30cd02ef6cebd6a97e2286595cc42439f"} Apr 16 17:02:41.227655 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:41.227660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" event={"ID":"2086a436-bded-4cab-b596-ed5ffdade66e","Type":"ContainerStarted","Data":"68b0628516008dc44be32c8312faf024b701ddb6c50535205043ef0547b89c26"} Apr 16 17:02:45.241251 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:45.241218 2576 generic.go:358] "Generic (PLEG): container finished" podID="2086a436-bded-4cab-b596-ed5ffdade66e" containerID="ec22ce850ae3d01f1ac2a95f9982cfb30cd02ef6cebd6a97e2286595cc42439f" exitCode=0 Apr 16 17:02:45.241621 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:45.241293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" event={"ID":"2086a436-bded-4cab-b596-ed5ffdade66e","Type":"ContainerDied","Data":"ec22ce850ae3d01f1ac2a95f9982cfb30cd02ef6cebd6a97e2286595cc42439f"} Apr 16 17:02:46.246230 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:46.246197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" event={"ID":"2086a436-bded-4cab-b596-ed5ffdade66e","Type":"ContainerStarted","Data":"7aa923615db6c66231352dd93e44990e4a45cc5a58036d9f9cbff41e820b26ef"} Apr 16 17:02:46.246230 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:46.246235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" event={"ID":"2086a436-bded-4cab-b596-ed5ffdade66e","Type":"ContainerStarted","Data":"4e2fa31ce7bae2faafc211dce9f140ea4dcb7bf9fe64543c70d6894c32d9cfcb"} Apr 16 17:02:46.246686 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:46.246483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:46.247835 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:46.247808 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:02:46.262660 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:46.262618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podStartSLOduration=6.262605865 podStartE2EDuration="6.262605865s" podCreationTimestamp="2026-04-16 17:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 17:02:46.260882226 +0000 UTC m=+923.358761294" watchObservedRunningTime="2026-04-16 17:02:46.262605865 +0000 UTC m=+923.360485010" Apr 16 17:02:47.249234 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:47.249202 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:02:47.249690 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:47.249366 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:02:47.250190 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:47.250165 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:02:48.253150 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:48.253103 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:02:48.253567 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:48.253371 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:02:58.253065 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:58.253020 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:02:58.253559 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:02:58.253536 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:03:08.253460 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:08.253409 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:03:08.253930 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:08.253831 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:03:18.253309 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:18.253262 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:03:18.253706 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:18.253623 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 16 17:03:28.253276 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:28.253228 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:03:28.253759 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:28.253729 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:03:38.253682 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:38.253630 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:03:38.254117 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:38.254033 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" podUID="2086a436-bded-4cab-b596-ed5ffdade66e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:03:48.253704 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:48.253672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:03:48.254197 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:48.253744 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj" Apr 16 17:03:50.567259 ip-10-0-130-1 kubenswrapper[2576]: I0416 
17:03:50.567232 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw"] Apr 16 17:03:50.570657 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.570638 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:03:50.577835 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.577813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw"] Apr 16 17:03:50.687817 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.687790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e9ca9e5-8d6f-44e8-a9d0-147483751715-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw\" (UID: \"2e9ca9e5-8d6f-44e8-a9d0-147483751715\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:03:50.788162 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.788134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e9ca9e5-8d6f-44e8-a9d0-147483751715-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw\" (UID: \"2e9ca9e5-8d6f-44e8-a9d0-147483751715\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:03:50.788442 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.788427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e9ca9e5-8d6f-44e8-a9d0-147483751715-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw\" (UID: 
\"2e9ca9e5-8d6f-44e8-a9d0-147483751715\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:03:50.881732 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.881669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:03:50.999180 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:50.999157 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw"] Apr 16 17:03:51.001164 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:03:51.001137 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9ca9e5_8d6f_44e8_a9d0_147483751715.slice/crio-60393079ef5c70763a22f4d3d7aefef97f3876129d52497a7492caea81dce0d2 WatchSource:0}: Error finding container 60393079ef5c70763a22f4d3d7aefef97f3876129d52497a7492caea81dce0d2: Status 404 returned error can't find the container with id 60393079ef5c70763a22f4d3d7aefef97f3876129d52497a7492caea81dce0d2 Apr 16 17:03:51.451999 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:51.451963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" event={"ID":"2e9ca9e5-8d6f-44e8-a9d0-147483751715","Type":"ContainerStarted","Data":"af38ce1b2832f5358c33f9b2eb221351c03237e4f97f7f68937b57c554b01cd6"} Apr 16 17:03:51.451999 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:51.452004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" event={"ID":"2e9ca9e5-8d6f-44e8-a9d0-147483751715","Type":"ContainerStarted","Data":"60393079ef5c70763a22f4d3d7aefef97f3876129d52497a7492caea81dce0d2"} Apr 16 17:03:55.468438 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:55.468406 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerID="af38ce1b2832f5358c33f9b2eb221351c03237e4f97f7f68937b57c554b01cd6" exitCode=0 Apr 16 17:03:55.468773 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:55.468442 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" event={"ID":"2e9ca9e5-8d6f-44e8-a9d0-147483751715","Type":"ContainerDied","Data":"af38ce1b2832f5358c33f9b2eb221351c03237e4f97f7f68937b57c554b01cd6"} Apr 16 17:03:56.472745 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:56.472709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" event={"ID":"2e9ca9e5-8d6f-44e8-a9d0-147483751715","Type":"ContainerStarted","Data":"72631db569f72ee75ab9a0f26f99a7d7d3a78696f623aff6f6e723b2be9cb6f6"} Apr 16 17:03:56.473177 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:56.473071 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:03:56.474286 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:56.474260 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:03:56.488992 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:56.488931 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podStartSLOduration=6.488919502 podStartE2EDuration="6.488919502s" podCreationTimestamp="2026-04-16 17:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
17:03:56.487585173 +0000 UTC m=+993.585464285" watchObservedRunningTime="2026-04-16 17:03:56.488919502 +0000 UTC m=+993.586798576" Apr 16 17:03:57.476530 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:03:57.476486 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:04:07.477259 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:04:07.477221 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:04:17.477196 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:04:17.477155 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:04:27.476637 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:04:27.476591 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:04:37.477415 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:04:37.477373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.33:8080: connect: connection refused" Apr 16 17:04:47.476706 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:04:47.476661 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:04:57.477100 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:04:57.477061 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:05:07.476870 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:05:07.476790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:05:16.505417 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:05:16.505377 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:05:26.505357 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:05:26.505315 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:05:36.506293 ip-10-0-130-1 
kubenswrapper[2576]: I0416 17:05:36.506251 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:05:46.505376 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:05:46.505329 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" podUID="2e9ca9e5-8d6f-44e8-a9d0-147483751715" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:05:56.507180 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:05:56.507150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw" Apr 16 17:06:00.818234 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:00.818203 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f"] Apr 16 17:06:00.821605 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:00.821590 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:06:00.827210 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:00.827188 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f"] Apr 16 17:06:00.909517 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:00.909481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53f05150-500a-4d35-8473-a1c6ae6ae157-kserve-provision-location\") pod \"isvc-primary-69a2b2-predictor-56f89765c8-tzp9f\" (UID: \"53f05150-500a-4d35-8473-a1c6ae6ae157\") " pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:06:01.010775 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.010731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53f05150-500a-4d35-8473-a1c6ae6ae157-kserve-provision-location\") pod \"isvc-primary-69a2b2-predictor-56f89765c8-tzp9f\" (UID: \"53f05150-500a-4d35-8473-a1c6ae6ae157\") " pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:06:01.011125 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.011107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53f05150-500a-4d35-8473-a1c6ae6ae157-kserve-provision-location\") pod \"isvc-primary-69a2b2-predictor-56f89765c8-tzp9f\" (UID: \"53f05150-500a-4d35-8473-a1c6ae6ae157\") " pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:06:01.132503 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.132402 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:06:01.257332 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.257304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f"] Apr 16 17:06:01.259913 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:06:01.259882 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53f05150_500a_4d35_8473_a1c6ae6ae157.slice/crio-c93c3f2acb830f5356c2ae8a8d34d5e5e7e819f031ccf54af3b188171e74af46 WatchSource:0}: Error finding container c93c3f2acb830f5356c2ae8a8d34d5e5e7e819f031ccf54af3b188171e74af46: Status 404 returned error can't find the container with id c93c3f2acb830f5356c2ae8a8d34d5e5e7e819f031ccf54af3b188171e74af46 Apr 16 17:06:01.262124 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.262107 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:06:01.869141 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.869106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" event={"ID":"53f05150-500a-4d35-8473-a1c6ae6ae157","Type":"ContainerStarted","Data":"481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73"} Apr 16 17:06:01.869141 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:01.869143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" event={"ID":"53f05150-500a-4d35-8473-a1c6ae6ae157","Type":"ContainerStarted","Data":"c93c3f2acb830f5356c2ae8a8d34d5e5e7e819f031ccf54af3b188171e74af46"} Apr 16 17:06:05.886249 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:05.886215 2576 generic.go:358] "Generic (PLEG): container finished" podID="53f05150-500a-4d35-8473-a1c6ae6ae157" 
containerID="481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73" exitCode=0 Apr 16 17:06:05.886616 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:05.886286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" event={"ID":"53f05150-500a-4d35-8473-a1c6ae6ae157","Type":"ContainerDied","Data":"481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73"} Apr 16 17:06:06.891061 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:06.890978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" event={"ID":"53f05150-500a-4d35-8473-a1c6ae6ae157","Type":"ContainerStarted","Data":"b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0"} Apr 16 17:06:06.891438 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:06.891278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:06:06.892315 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:06.892292 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:06:06.907655 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:06.907611 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podStartSLOduration=6.907600176 podStartE2EDuration="6.907600176s" podCreationTimestamp="2026-04-16 17:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:06:06.906460727 +0000 UTC m=+1124.004339804" watchObservedRunningTime="2026-04-16 17:06:06.907600176 +0000 
UTC m=+1124.005479251" Apr 16 17:06:07.895531 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:07.895489 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:06:17.895892 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:17.895839 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:06:27.896316 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:27.896268 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:06:37.895540 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:37.895451 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:06:47.896068 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:47.896025 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:06:57.896340 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:06:57.896294 2576 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:07:07.896019 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:07.895972 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 17:07:17.896915 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:17.896880 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:07:20.981753 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:20.981715 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8"] Apr 16 17:07:20.985208 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:20.985188 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:20.988315 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:20.988291 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-69a2b2-dockercfg-sw92g\"" Apr 16 17:07:20.988843 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:20.988820 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-69a2b2\"" Apr 16 17:07:20.989547 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:20.989525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 17:07:20.994206 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:20.994186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8"] Apr 16 17:07:21.045862 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.045829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5e0a32f-691b-4e70-98b2-50a42b061857-kserve-provision-location\") pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.045862 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.045869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c5e0a32f-691b-4e70-98b2-50a42b061857-cabundle-cert\") pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.146373 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.146333 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5e0a32f-691b-4e70-98b2-50a42b061857-kserve-provision-location\") pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.146373 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.146376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c5e0a32f-691b-4e70-98b2-50a42b061857-cabundle-cert\") pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.146734 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.146715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5e0a32f-691b-4e70-98b2-50a42b061857-kserve-provision-location\") pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.147084 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.147065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c5e0a32f-691b-4e70-98b2-50a42b061857-cabundle-cert\") pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.297812 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.297716 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:21.426787 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:21.426761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8"] Apr 16 17:07:21.429465 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:07:21.429435 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e0a32f_691b_4e70_98b2_50a42b061857.slice/crio-b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6 WatchSource:0}: Error finding container b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6: Status 404 returned error can't find the container with id b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6 Apr 16 17:07:22.138573 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:22.138533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" event={"ID":"c5e0a32f-691b-4e70-98b2-50a42b061857","Type":"ContainerStarted","Data":"76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d"} Apr 16 17:07:22.138573 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:22.138571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" event={"ID":"c5e0a32f-691b-4e70-98b2-50a42b061857","Type":"ContainerStarted","Data":"b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6"} Apr 16 17:07:23.491238 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:23.491208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 17:07:23.492656 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:23.492635 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log" Apr 16 17:07:27.156867 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:27.156837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/0.log" Apr 16 17:07:27.157298 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:27.156878 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerID="76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d" exitCode=1 Apr 16 17:07:27.157298 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:27.156935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" event={"ID":"c5e0a32f-691b-4e70-98b2-50a42b061857","Type":"ContainerDied","Data":"76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d"} Apr 16 17:07:28.162735 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:28.162705 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/0.log" Apr 16 17:07:28.163118 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:28.162771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" event={"ID":"c5e0a32f-691b-4e70-98b2-50a42b061857","Type":"ContainerStarted","Data":"2c7d4107813a50cd89594a9982668d41d3e73913b6642b27d90da8cb1e7a4992"} Apr 16 17:07:30.172081 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:30.172048 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/1.log" Apr 16 17:07:30.172556 ip-10-0-130-1 kubenswrapper[2576]: I0416 
17:07:30.172427 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/0.log" Apr 16 17:07:30.172556 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:30.172466 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerID="2c7d4107813a50cd89594a9982668d41d3e73913b6642b27d90da8cb1e7a4992" exitCode=1 Apr 16 17:07:30.172556 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:30.172517 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" event={"ID":"c5e0a32f-691b-4e70-98b2-50a42b061857","Type":"ContainerDied","Data":"2c7d4107813a50cd89594a9982668d41d3e73913b6642b27d90da8cb1e7a4992"} Apr 16 17:07:30.172705 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:30.172562 2576 scope.go:117] "RemoveContainer" containerID="76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d" Apr 16 17:07:30.172933 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:30.172914 2576 scope.go:117] "RemoveContainer" containerID="76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d" Apr 16 17:07:30.186300 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:07:30.186243 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_kserve-ci-e2e-test_c5e0a32f-691b-4e70-98b2-50a42b061857_0 in pod sandbox b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6 from index: no such id: '76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d'" containerID="76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d" Apr 16 17:07:30.186422 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:07:30.186317 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container 
\"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_kserve-ci-e2e-test_c5e0a32f-691b-4e70-98b2-50a42b061857_0 in pod sandbox b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6 from index: no such id: '76d342f3977c75695e26b6696bae74df361e6e369f7a2263187e45a68b7c4a5d'; Skipping pod \"isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_kserve-ci-e2e-test(c5e0a32f-691b-4e70-98b2-50a42b061857)\"" logger="UnhandledError" Apr 16 17:07:30.187749 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:07:30.187725 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_kserve-ci-e2e-test(c5e0a32f-691b-4e70-98b2-50a42b061857)\"" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" Apr 16 17:07:31.178091 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:31.178057 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/1.log" Apr 16 17:07:39.009636 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.009600 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8"] Apr 16 17:07:39.056219 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.056185 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f"] Apr 16 17:07:39.056522 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.056487 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" 
podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" containerID="cri-o://b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0" gracePeriod=30 Apr 16 17:07:39.149805 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.149775 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5"] Apr 16 17:07:39.154000 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.153980 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.157474 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.157449 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-7ebb90\"" Apr 16 17:07:39.157591 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.157491 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-7ebb90-dockercfg-x6bs5\"" Apr 16 17:07:39.158754 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.158732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/1.log" Apr 16 17:07:39.158897 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.158806 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:39.161378 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.161355 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5"] Apr 16 17:07:39.204281 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.204240 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c5e0a32f-691b-4e70-98b2-50a42b061857-cabundle-cert\") pod \"c5e0a32f-691b-4e70-98b2-50a42b061857\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " Apr 16 17:07:39.204475 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.204323 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5e0a32f-691b-4e70-98b2-50a42b061857-kserve-provision-location\") pod \"c5e0a32f-691b-4e70-98b2-50a42b061857\" (UID: \"c5e0a32f-691b-4e70-98b2-50a42b061857\") " Apr 16 17:07:39.204475 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.204454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-kserve-provision-location\") pod \"isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.204605 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.204516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-cabundle-cert\") pod \"isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.204666 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.204598 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e0a32f-691b-4e70-98b2-50a42b061857-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5e0a32f-691b-4e70-98b2-50a42b061857" (UID: "c5e0a32f-691b-4e70-98b2-50a42b061857"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:07:39.204666 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.204605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e0a32f-691b-4e70-98b2-50a42b061857-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c5e0a32f-691b-4e70-98b2-50a42b061857" (UID: "c5e0a32f-691b-4e70-98b2-50a42b061857"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:07:39.205330 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.205311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8_c5e0a32f-691b-4e70-98b2-50a42b061857/storage-initializer/1.log" Apr 16 17:07:39.205461 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.205441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" event={"ID":"c5e0a32f-691b-4e70-98b2-50a42b061857","Type":"ContainerDied","Data":"b05107b065604e827a8d604e43d0fcf53dd3f44f527a781acb66c8abfe5c7fd6"} Apr 16 17:07:39.205531 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.205475 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8" Apr 16 17:07:39.205531 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.205481 2576 scope.go:117] "RemoveContainer" containerID="2c7d4107813a50cd89594a9982668d41d3e73913b6642b27d90da8cb1e7a4992" Apr 16 17:07:39.238469 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.238437 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8"] Apr 16 17:07:39.243113 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.243083 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-69a2b2-predictor-bc7cfb67-f5tl8"] Apr 16 17:07:39.305366 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.305256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-cabundle-cert\") pod \"isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.305366 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.305350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-kserve-provision-location\") pod \"isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.305608 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.305397 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c5e0a32f-691b-4e70-98b2-50a42b061857-cabundle-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:07:39.305608 ip-10-0-130-1 kubenswrapper[2576]: 
I0416 17:07:39.305409 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5e0a32f-691b-4e70-98b2-50a42b061857-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:07:39.305818 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.305795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-kserve-provision-location\") pod \"isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.305930 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.305916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-cabundle-cert\") pod \"isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.469184 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.469144 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:39.510434 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.510401 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" path="/var/lib/kubelet/pods/c5e0a32f-691b-4e70-98b2-50a42b061857/volumes" Apr 16 17:07:39.598188 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:39.598164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5"] Apr 16 17:07:39.600345 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:07:39.600316 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca4c3af6_5616_42b6_9b89_ee283f3cf89b.slice/crio-d52fbe544174f4fef0221ec9f68a87a71dd1c84cf98a9d4347063a8d1cd52f0a WatchSource:0}: Error finding container d52fbe544174f4fef0221ec9f68a87a71dd1c84cf98a9d4347063a8d1cd52f0a: Status 404 returned error can't find the container with id d52fbe544174f4fef0221ec9f68a87a71dd1c84cf98a9d4347063a8d1cd52f0a Apr 16 17:07:40.220417 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:40.220377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" event={"ID":"ca4c3af6-5616-42b6-9b89-ee283f3cf89b","Type":"ContainerStarted","Data":"07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a"} Apr 16 17:07:40.221029 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:40.220424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" event={"ID":"ca4c3af6-5616-42b6-9b89-ee283f3cf89b","Type":"ContainerStarted","Data":"d52fbe544174f4fef0221ec9f68a87a71dd1c84cf98a9d4347063a8d1cd52f0a"} Apr 16 17:07:43.232511 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.232474 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5_ca4c3af6-5616-42b6-9b89-ee283f3cf89b/storage-initializer/0.log" Apr 16 17:07:43.232892 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.232521 2576 generic.go:358] "Generic (PLEG): container finished" podID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerID="07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a" exitCode=1 Apr 16 17:07:43.232892 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.232609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" event={"ID":"ca4c3af6-5616-42b6-9b89-ee283f3cf89b","Type":"ContainerDied","Data":"07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a"} Apr 16 17:07:43.612597 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.612571 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:07:43.743540 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.743507 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53f05150-500a-4d35-8473-a1c6ae6ae157-kserve-provision-location\") pod \"53f05150-500a-4d35-8473-a1c6ae6ae157\" (UID: \"53f05150-500a-4d35-8473-a1c6ae6ae157\") " Apr 16 17:07:43.743926 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.743895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f05150-500a-4d35-8473-a1c6ae6ae157-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53f05150-500a-4d35-8473-a1c6ae6ae157" (UID: "53f05150-500a-4d35-8473-a1c6ae6ae157"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:07:43.844970 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:43.844848 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53f05150-500a-4d35-8473-a1c6ae6ae157-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:07:44.153159 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.153065 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5"] Apr 16 17:07:44.237966 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.237914 2576 generic.go:358] "Generic (PLEG): container finished" podID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerID="b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0" exitCode=0 Apr 16 17:07:44.238349 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.238008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" event={"ID":"53f05150-500a-4d35-8473-a1c6ae6ae157","Type":"ContainerDied","Data":"b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0"} Apr 16 17:07:44.238349 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.238037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" event={"ID":"53f05150-500a-4d35-8473-a1c6ae6ae157","Type":"ContainerDied","Data":"c93c3f2acb830f5356c2ae8a8d34d5e5e7e819f031ccf54af3b188171e74af46"} Apr 16 17:07:44.238349 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.238045 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f" Apr 16 17:07:44.238349 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.238054 2576 scope.go:117] "RemoveContainer" containerID="b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0" Apr 16 17:07:44.240330 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.240312 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5_ca4c3af6-5616-42b6-9b89-ee283f3cf89b/storage-initializer/0.log" Apr 16 17:07:44.240443 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.240389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" event={"ID":"ca4c3af6-5616-42b6-9b89-ee283f3cf89b","Type":"ContainerStarted","Data":"423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b"} Apr 16 17:07:44.240600 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.240572 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" containerID="cri-o://423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b" gracePeriod=30 Apr 16 17:07:44.247182 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.247160 2576 scope.go:117] "RemoveContainer" containerID="481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73" Apr 16 17:07:44.255163 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.255138 2576 scope.go:117] "RemoveContainer" containerID="b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0" Apr 16 17:07:44.255413 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:07:44.255395 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0\": container with ID starting with b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0 not found: ID does not exist" containerID="b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0" Apr 16 17:07:44.255460 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.255424 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0"} err="failed to get container status \"b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0\": rpc error: code = NotFound desc = could not find container \"b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0\": container with ID starting with b3c988dad050153474e136d9b996eb29cbead886d7d4d87417d8167b00d7cae0 not found: ID does not exist" Apr 16 17:07:44.255460 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.255442 2576 scope.go:117] "RemoveContainer" containerID="481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73" Apr 16 17:07:44.255706 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:07:44.255683 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73\": container with ID starting with 481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73 not found: ID does not exist" containerID="481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73" Apr 16 17:07:44.255783 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.255719 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73"} err="failed to get container status \"481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73\": rpc error: code = NotFound desc = could not find container 
\"481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73\": container with ID starting with 481ee1676c0939fc3d1ee0800a2d8ed25764be04a0dd14cfd5a115967cdfda73 not found: ID does not exist" Apr 16 17:07:44.273787 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.273761 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f"] Apr 16 17:07:44.281613 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.281588 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-69a2b2-predictor-56f89765c8-tzp9f"] Apr 16 17:07:44.285640 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.285618 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l"] Apr 16 17:07:44.285970 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.285956 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" Apr 16 17:07:44.286033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.285971 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" Apr 16 17:07:44.286033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.285980 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerName="storage-initializer" Apr 16 17:07:44.286033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.285986 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerName="storage-initializer" Apr 16 17:07:44.286033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.285999 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerName="storage-initializer" Apr 16 17:07:44.286033 ip-10-0-130-1 
kubenswrapper[2576]: I0416 17:07:44.286004 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerName="storage-initializer" Apr 16 17:07:44.286033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.286019 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="storage-initializer" Apr 16 17:07:44.286033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.286024 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="storage-initializer" Apr 16 17:07:44.286248 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.286072 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" containerName="kserve-container" Apr 16 17:07:44.286248 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.286082 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerName="storage-initializer" Apr 16 17:07:44.286248 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.286088 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5e0a32f-691b-4e70-98b2-50a42b061857" containerName="storage-initializer" Apr 16 17:07:44.290435 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.290417 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:07:44.299121 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.299098 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l"] Apr 16 17:07:44.349296 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.349252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d315dc-af6b-4a1d-8105-bd1156211f7d-kserve-provision-location\") pod \"raw-sklearn-19113-predictor-6dbb9fd749-wfs4l\" (UID: \"92d315dc-af6b-4a1d-8105-bd1156211f7d\") " pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:07:44.450657 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.450575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d315dc-af6b-4a1d-8105-bd1156211f7d-kserve-provision-location\") pod \"raw-sklearn-19113-predictor-6dbb9fd749-wfs4l\" (UID: \"92d315dc-af6b-4a1d-8105-bd1156211f7d\") " pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:07:44.451003 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.450984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d315dc-af6b-4a1d-8105-bd1156211f7d-kserve-provision-location\") pod \"raw-sklearn-19113-predictor-6dbb9fd749-wfs4l\" (UID: \"92d315dc-af6b-4a1d-8105-bd1156211f7d\") " pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:07:44.601604 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.601560 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:07:44.726023 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:44.725982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l"] Apr 16 17:07:44.728426 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:07:44.728396 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d315dc_af6b_4a1d_8105_bd1156211f7d.slice/crio-57a416d72ef2f23fc66ed5c7218117902d25cc33808af82bfe5874130fac439b WatchSource:0}: Error finding container 57a416d72ef2f23fc66ed5c7218117902d25cc33808af82bfe5874130fac439b: Status 404 returned error can't find the container with id 57a416d72ef2f23fc66ed5c7218117902d25cc33808af82bfe5874130fac439b Apr 16 17:07:45.246957 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:45.246890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" event={"ID":"92d315dc-af6b-4a1d-8105-bd1156211f7d","Type":"ContainerStarted","Data":"2ea38791ae24089d8ca250a6af167e2fd7dc6e4d69d4d3204f3e1b623ee66cac"} Apr 16 17:07:45.246957 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:45.246937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" event={"ID":"92d315dc-af6b-4a1d-8105-bd1156211f7d","Type":"ContainerStarted","Data":"57a416d72ef2f23fc66ed5c7218117902d25cc33808af82bfe5874130fac439b"} Apr 16 17:07:45.509597 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:45.509520 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f05150-500a-4d35-8473-a1c6ae6ae157" path="/var/lib/kubelet/pods/53f05150-500a-4d35-8473-a1c6ae6ae157/volumes" Apr 16 17:07:49.262141 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.262105 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerID="2ea38791ae24089d8ca250a6af167e2fd7dc6e4d69d4d3204f3e1b623ee66cac" exitCode=0 Apr 16 17:07:49.262530 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.262149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" event={"ID":"92d315dc-af6b-4a1d-8105-bd1156211f7d","Type":"ContainerDied","Data":"2ea38791ae24089d8ca250a6af167e2fd7dc6e4d69d4d3204f3e1b623ee66cac"} Apr 16 17:07:49.489917 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.489890 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5_ca4c3af6-5616-42b6-9b89-ee283f3cf89b/storage-initializer/1.log" Apr 16 17:07:49.490314 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.490298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5_ca4c3af6-5616-42b6-9b89-ee283f3cf89b/storage-initializer/0.log" Apr 16 17:07:49.490375 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.490363 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:49.594236 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.594152 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-kserve-provision-location\") pod \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " Apr 16 17:07:49.594236 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.594221 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-cabundle-cert\") pod \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\" (UID: \"ca4c3af6-5616-42b6-9b89-ee283f3cf89b\") " Apr 16 17:07:49.594463 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.594439 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca4c3af6-5616-42b6-9b89-ee283f3cf89b" (UID: "ca4c3af6-5616-42b6-9b89-ee283f3cf89b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:07:49.594558 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.594536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ca4c3af6-5616-42b6-9b89-ee283f3cf89b" (UID: "ca4c3af6-5616-42b6-9b89-ee283f3cf89b"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:07:49.695546 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.695508 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-kserve-provision-location\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:07:49.695546 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:49.695541 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca4c3af6-5616-42b6-9b89-ee283f3cf89b-cabundle-cert\") on node \"ip-10-0-130-1.ec2.internal\" DevicePath \"\"" Apr 16 17:07:50.266876 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.266837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" event={"ID":"92d315dc-af6b-4a1d-8105-bd1156211f7d","Type":"ContainerStarted","Data":"6eb8387f8415dfe9dc0d9cc3165b16073647d00c8a4bd27060bd5f156b2ca6bf"} Apr 16 17:07:50.267365 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.267195 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:07:50.268078 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268057 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5_ca4c3af6-5616-42b6-9b89-ee283f3cf89b/storage-initializer/1.log" Apr 16 17:07:50.268440 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268427 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5_ca4c3af6-5616-42b6-9b89-ee283f3cf89b/storage-initializer/0.log" Apr 16 17:07:50.268513 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268458 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerID="423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b" exitCode=1 Apr 16 17:07:50.268560 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" event={"ID":"ca4c3af6-5616-42b6-9b89-ee283f3cf89b","Type":"ContainerDied","Data":"423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b"} Apr 16 17:07:50.268560 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" event={"ID":"ca4c3af6-5616-42b6-9b89-ee283f3cf89b","Type":"ContainerDied","Data":"d52fbe544174f4fef0221ec9f68a87a71dd1c84cf98a9d4347063a8d1cd52f0a"} Apr 16 17:07:50.268560 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268550 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5" Apr 16 17:07:50.268685 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.268562 2576 scope.go:117] "RemoveContainer" containerID="423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b" Apr 16 17:07:50.269037 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.269003 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:07:50.277416 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.277400 2576 scope.go:117] "RemoveContainer" containerID="07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a" Apr 16 17:07:50.282495 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.282446 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podStartSLOduration=6.28243171 podStartE2EDuration="6.28243171s" podCreationTimestamp="2026-04-16 17:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:07:50.281406329 +0000 UTC m=+1227.379285406" watchObservedRunningTime="2026-04-16 17:07:50.28243171 +0000 UTC m=+1227.380310785" Apr 16 17:07:50.285829 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.285809 2576 scope.go:117] "RemoveContainer" containerID="423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b" Apr 16 17:07:50.286128 ip-10-0-130-1 kubenswrapper[2576]: E0416 17:07:50.286100 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b\": container with ID starting with 423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b not found: ID does not exist" containerID="423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b" Apr 16 17:07:50.286189 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.286139 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b"} err="failed to get container status \"423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b\": rpc error: code = NotFound desc = could not find container \"423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b\": container with ID starting with 423c6032908525a289c9ff270cd09d5c2b6d9adc1d2149b5903f4e237496fe6b not found: ID does not exist" Apr 16 17:07:50.286189 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.286156 2576 scope.go:117] "RemoveContainer" containerID="07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a" Apr 16 17:07:50.286409 ip-10-0-130-1 
kubenswrapper[2576]: E0416 17:07:50.286389 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a\": container with ID starting with 07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a not found: ID does not exist" containerID="07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a" Apr 16 17:07:50.286451 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.286415 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a"} err="failed to get container status \"07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a\": rpc error: code = NotFound desc = could not find container \"07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a\": container with ID starting with 07bb4e8ea799d94de628b9c01e057551cff57dfd258f0c4d282d2240c5e1382a not found: ID does not exist" Apr 16 17:07:50.305703 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.305676 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5"] Apr 16 17:07:50.309022 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:50.308999 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ebb90-predictor-6b8ccc595-6qcz5"] Apr 16 17:07:51.274161 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:51.274123 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:07:51.511511 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:07:51.511463 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" path="/var/lib/kubelet/pods/ca4c3af6-5616-42b6-9b89-ee283f3cf89b/volumes" Apr 16 17:08:01.275151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:08:01.275102 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:08:11.274807 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:08:11.274715 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:08:21.275127 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:08:21.275077 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:08:31.274201 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:08:31.274148 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:08:41.275217 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:08:41.275172 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 
17:08:51.274799 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:08:51.274757 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" podUID="92d315dc-af6b-4a1d-8105-bd1156211f7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 17:09:01.275121 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:01.275089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-19113-predictor-6dbb9fd749-wfs4l" Apr 16 17:09:04.472000 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.471966 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs"] Apr 16 17:09:04.472341 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.472289 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" Apr 16 17:09:04.472341 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.472300 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" Apr 16 17:09:04.472341 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.472308 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" Apr 16 17:09:04.472341 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.472314 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" Apr 16 17:09:04.472467 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.472370 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" Apr 16 17:09:04.472499 ip-10-0-130-1 kubenswrapper[2576]: I0416 
17:09:04.472468 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca4c3af6-5616-42b6-9b89-ee283f3cf89b" containerName="storage-initializer" Apr 16 17:09:04.475425 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.475408 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:09:04.485038 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.485015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs"] Apr 16 17:09:04.646269 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.646232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e32be801-b24c-4954-8d62-24bff5604915-kserve-provision-location\") pod \"raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs\" (UID: \"e32be801-b24c-4954-8d62-24bff5604915\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:09:04.747056 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.747028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e32be801-b24c-4954-8d62-24bff5604915-kserve-provision-location\") pod \"raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs\" (UID: \"e32be801-b24c-4954-8d62-24bff5604915\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:09:04.747437 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.747413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e32be801-b24c-4954-8d62-24bff5604915-kserve-provision-location\") pod \"raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs\" (UID: \"e32be801-b24c-4954-8d62-24bff5604915\") " 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:09:04.786487 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.786466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:09:04.908491 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:04.908428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs"] Apr 16 17:09:04.910736 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:09:04.910710 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32be801_b24c_4954_8d62_24bff5604915.slice/crio-8ce2f8e04e802d9f826f5b42ca177778a71673ab7689a020c62b23edef434708 WatchSource:0}: Error finding container 8ce2f8e04e802d9f826f5b42ca177778a71673ab7689a020c62b23edef434708: Status 404 returned error can't find the container with id 8ce2f8e04e802d9f826f5b42ca177778a71673ab7689a020c62b23edef434708 Apr 16 17:09:05.533574 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:05.533541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" event={"ID":"e32be801-b24c-4954-8d62-24bff5604915","Type":"ContainerStarted","Data":"41d544b8ddcc4eb4141913777c4bc72a972d80da12973ef7edb54eb7f909fe15"} Apr 16 17:09:05.533574 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:05.533574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" event={"ID":"e32be801-b24c-4954-8d62-24bff5604915","Type":"ContainerStarted","Data":"8ce2f8e04e802d9f826f5b42ca177778a71673ab7689a020c62b23edef434708"} Apr 16 17:09:09.546278 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:09.546241 2576 generic.go:358] "Generic (PLEG): container finished" podID="e32be801-b24c-4954-8d62-24bff5604915" 
containerID="41d544b8ddcc4eb4141913777c4bc72a972d80da12973ef7edb54eb7f909fe15" exitCode=0 Apr 16 17:09:09.546660 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:09.546309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" event={"ID":"e32be801-b24c-4954-8d62-24bff5604915","Type":"ContainerDied","Data":"41d544b8ddcc4eb4141913777c4bc72a972d80da12973ef7edb54eb7f909fe15"} Apr 16 17:09:10.551033 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:10.550998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" event={"ID":"e32be801-b24c-4954-8d62-24bff5604915","Type":"ContainerStarted","Data":"9098e462a86dcca26e7fb294a3ef88e15e1dfbb59b6e09c62e413fd3371fab0e"} Apr 16 17:09:10.551492 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:10.551290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:09:10.552666 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:10.552638 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:09:10.567537 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:10.567490 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podStartSLOduration=6.567479118 podStartE2EDuration="6.567479118s" podCreationTimestamp="2026-04-16 17:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:09:10.566402183 +0000 UTC m=+1307.664281261" 
watchObservedRunningTime="2026-04-16 17:09:10.567479118 +0000 UTC m=+1307.665358193" Apr 16 17:09:11.554026 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:11.553989 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:09:21.554996 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:21.554935 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:09:31.554325 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:31.554285 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:09:41.554285 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:41.554199 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:09:51.554245 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:09:51.554187 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 
17:10:01.555036 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:01.554985 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:10:11.554466 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:11.554422 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" podUID="e32be801-b24c-4954-8d62-24bff5604915" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 17:10:21.555133 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:21.555102 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs" Apr 16 17:10:39.188693 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.188664 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/kserve-container/0.log" Apr 16 17:10:39.209315 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.209291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/agent/0.log" Apr 16 17:10:39.218510 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.218487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/storage-initializer/0.log" Apr 16 17:10:39.231424 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.231405 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/kserve-container/0.log" Apr 16 17:10:39.240768 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.240750 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/agent/0.log" Apr 16 17:10:39.249894 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.249870 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:39.262481 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.262463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:39.272584 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.272563 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:39.309566 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.309539 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:39.318827 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.318798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log" Apr 16 17:10:39.332667 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.332646 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log" Apr 16 17:10:39.342721 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:39.342696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log" Apr 16 17:10:40.011746 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.011715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/kserve-container/0.log" Apr 16 17:10:40.022689 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.022666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/agent/0.log" Apr 16 17:10:40.033483 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.033460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/storage-initializer/0.log" Apr 16 17:10:40.050260 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.050230 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/kserve-container/0.log" Apr 16 17:10:40.059344 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.059325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/agent/0.log" Apr 16 17:10:40.069509 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.069487 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:40.082760 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.082739 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:40.092599 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.092574 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:40.120859 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.120838 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:40.131552 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.131528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log" Apr 16 17:10:40.145250 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.145221 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log" Apr 16 17:10:40.155371 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.155354 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log" Apr 16 17:10:40.846266 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.846237 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/kserve-container/0.log" Apr 16 17:10:40.856515 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.856474 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/agent/0.log" Apr 16 17:10:40.866621 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.866592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/storage-initializer/0.log" Apr 16 17:10:40.879353 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.879324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/kserve-container/0.log" Apr 16 17:10:40.888249 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.888218 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/agent/0.log" Apr 16 17:10:40.898282 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.898263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:40.910588 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.910569 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:40.919689 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.919670 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:40.945288 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.945265 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:40.954850 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.954814 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log" Apr 16 17:10:40.966743 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.966720 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log" Apr 16 17:10:40.976083 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:40.976058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log" Apr 16 17:10:41.618538 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.618512 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/kserve-container/0.log" Apr 16 17:10:41.629060 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.629038 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/agent/0.log" Apr 16 17:10:41.638433 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.638413 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/storage-initializer/0.log" Apr 16 17:10:41.650992 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.650961 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/kserve-container/0.log" Apr 16 17:10:41.660372 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.660351 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/agent/0.log" Apr 16 17:10:41.669668 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.669650 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:41.682374 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.682344 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:41.692077 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.692059 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:41.717384 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.717359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:41.726825 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.726804 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log"
Apr 16 17:10:41.739373 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.739348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log"
Apr 16 17:10:41.748876 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:41.748857 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log"
-- The identical "Finished parsing log file" cycle for the same set of pod log files repeats roughly once per second from Apr 16 17:10:42 through Apr 16 17:10:51; repeated entries omitted --
Apr 16 17:10:51.555258 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.555233 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:51.571960 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.571918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:51.587217 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.587193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:51.624667 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.624640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:51.640314 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.640291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log" Apr 16 17:10:51.659463 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.659436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log" Apr 16 17:10:51.675252 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:51.675226 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log" Apr 16 17:10:52.349667 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.349638 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/kserve-container/0.log" Apr 16 17:10:52.363675 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.363645 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/agent/0.log" Apr 16 17:10:52.383304 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.383285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/storage-initializer/0.log" Apr 16 17:10:52.408323 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.408298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/kserve-container/0.log" Apr 16 17:10:52.446667 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.446642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/agent/0.log" Apr 16 17:10:52.483054 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.483030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:52.514480 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.514456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:52.536745 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.536718 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:52.577185 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.577162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:52.596510 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.596482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log" Apr 16 17:10:52.619900 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.619834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log" Apr 16 17:10:52.643494 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:52.643475 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log" Apr 16 17:10:53.404868 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.404841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/kserve-container/0.log" Apr 16 17:10:53.418679 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.418650 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/agent/0.log" Apr 16 17:10:53.430219 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.430195 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-logger-raw-dc743-predictor-94bcc8d69-6pdkj_2086a436-bded-4cab-b596-ed5ffdade66e/storage-initializer/0.log" Apr 16 17:10:53.447350 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.447329 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/kserve-container/0.log" Apr 16 17:10:53.461893 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.461872 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/agent/0.log" Apr 16 17:10:53.484809 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.484789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-4801b-predictor-78465c5d67-kxgcw_e3bcf0d4-3673-4f6b-a638-04809c5e5256/storage-initializer/0.log" Apr 16 17:10:53.501200 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.501178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/kserve-container/0.log" Apr 16 17:10:53.517702 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.517681 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-scale-raw-ae117-predictor-5db48cf96c-rqscw_2e9ca9e5-8d6f-44e8-a9d0-147483751715/storage-initializer/0.log" Apr 16 17:10:53.567602 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.567579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/kserve-container/0.log" Apr 16 17:10:53.587635 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.587608 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-19113-predictor-6dbb9fd749-wfs4l_92d315dc-af6b-4a1d-8105-bd1156211f7d/storage-initializer/0.log" Apr 16 17:10:53.606634 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.606599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/kserve-container/0.log" Apr 16 17:10:53.620695 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:53.620674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_raw-sklearn-runtime-aa36c-predictor-5b67688557-qw2cs_e32be801-b24c-4954-8d62-24bff5604915/storage-initializer/0.log" Apr 16 17:10:58.637452 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:58.637421 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cwn6w_185db826-8d91-4046-9d72-6213e3ded5af/global-pull-secret-syncer/0.log" Apr 16 17:10:58.722646 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:58.722617 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-znpbx_a59ddb8d-570d-4782-afa6-1d2e490cf42f/konnectivity-agent/0.log" Apr 16 17:10:58.764248 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:10:58.764216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-1.ec2.internal_b2d196e213c72dc0e82e18661b4a3b77/haproxy/0.log" Apr 16 17:11:02.415918 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:02.415891 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-64f884946d-gq5fq_a07c2b8a-d52e-4075-8340-3571cb362506/metrics-server/0.log" Apr 16 17:11:02.489408 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:02.489376 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4q5ss_686536bf-e737-40fa-8d47-a76a6dfd56c5/node-exporter/0.log" Apr 16 17:11:02.517465 ip-10-0-130-1 
kubenswrapper[2576]: I0416 17:11:02.517440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4q5ss_686536bf-e737-40fa-8d47-a76a6dfd56c5/kube-rbac-proxy/0.log" Apr 16 17:11:02.547544 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:02.547519 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4q5ss_686536bf-e737-40fa-8d47-a76a6dfd56c5/init-textfile/0.log" Apr 16 17:11:02.746596 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:02.746570 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-bqs9x_3e5c992b-b27e-4f83-bb52-d788e02e6097/kube-rbac-proxy-main/0.log" Apr 16 17:11:02.770996 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:02.770974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-bqs9x_3e5c992b-b27e-4f83-bb52-d788e02e6097/kube-rbac-proxy-self/0.log" Apr 16 17:11:02.794653 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:02.794621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-bqs9x_3e5c992b-b27e-4f83-bb52-d788e02e6097/openshift-state-metrics/0.log" Apr 16 17:11:03.061321 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.061234 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-dn2q7_43b34e2c-035c-4c11-806c-40bf965f3ab2/prometheus-operator-admission-webhook/0.log" Apr 16 17:11:03.173475 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.173449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58d7f4698b-x74k9_c24cdb38-9b30-40da-a8f8-93f65dd132b9/thanos-query/0.log" Apr 16 17:11:03.196979 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.196930 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-58d7f4698b-x74k9_c24cdb38-9b30-40da-a8f8-93f65dd132b9/kube-rbac-proxy-web/0.log" Apr 16 17:11:03.219902 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.219877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58d7f4698b-x74k9_c24cdb38-9b30-40da-a8f8-93f65dd132b9/kube-rbac-proxy/0.log" Apr 16 17:11:03.241284 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.241251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58d7f4698b-x74k9_c24cdb38-9b30-40da-a8f8-93f65dd132b9/prom-label-proxy/0.log" Apr 16 17:11:03.263723 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.263702 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58d7f4698b-x74k9_c24cdb38-9b30-40da-a8f8-93f65dd132b9/kube-rbac-proxy-rules/0.log" Apr 16 17:11:03.283616 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:03.283600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58d7f4698b-x74k9_c24cdb38-9b30-40da-a8f8-93f65dd132b9/kube-rbac-proxy-metrics/0.log" Apr 16 17:11:04.455171 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:04.455141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-4tmzr_7a1cc086-1280-418f-b306-f49c2436ad0c/networking-console-plugin/0.log" Apr 16 17:11:05.245591 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:05.245563 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b9c4cb5d-8h8t8_d409e915-8e1f-44de-ad55-ad3b405d0ae9/console/0.log" Apr 16 17:11:05.276501 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:05.276474 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-c2zvb_b8fcc185-91e6-436f-8394-c2766f17f7a6/download-server/0.log" Apr 16 17:11:06.270464 ip-10-0-130-1 
kubenswrapper[2576]: I0416 17:11:06.270441 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2j894_1997f718-c32f-43e2-8412-29f59bf82303/dns/0.log" Apr 16 17:11:06.289965 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.289919 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2j894_1997f718-c32f-43e2-8412-29f59bf82303/kube-rbac-proxy/0.log" Apr 16 17:11:06.400924 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.400890 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4l9b9_51e9a221-2ee4-44c8-bb3a-29addd9e2fe5/dns-node-resolver/0.log" Apr 16 17:11:06.670873 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.670793 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr"] Apr 16 17:11:06.674542 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.674520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.676904 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.676881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fspv\"/\"kube-root-ca.crt\"" Apr 16 17:11:06.678168 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.678144 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2fspv\"/\"default-dockercfg-f9wr2\"" Apr 16 17:11:06.678340 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.678221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fspv\"/\"openshift-service-ca.crt\"" Apr 16 17:11:06.680273 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.680225 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr"] Apr 16 17:11:06.821533 ip-10-0-130-1 
kubenswrapper[2576]: I0416 17:11:06.821494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-proc\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.821709 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.821546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-podres\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.821709 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.821578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-sys\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.821709 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.821640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx2d8\" (UniqueName: \"kubernetes.io/projected/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-kube-api-access-nx2d8\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.821709 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.821663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-lib-modules\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.922852 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.922766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-sys\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.922852 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.922830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nx2d8\" (UniqueName: \"kubernetes.io/projected/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-kube-api-access-nx2d8\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.922852 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.922854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-lib-modules\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.923151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.922878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-proc\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.923151 ip-10-0-130-1 kubenswrapper[2576]: I0416 
17:11:06.922904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-podres\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.923151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.922903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-sys\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.923151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.923034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-podres\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.923151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.923043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-lib-modules\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.923151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.923034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-proc\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 
17:11:06.930481 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.930456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx2d8\" (UniqueName: \"kubernetes.io/projected/d3b1abf1-3bce-4c0a-b2e3-652a58d08663-kube-api-access-nx2d8\") pod \"perf-node-gather-daemonset-z4wcr\" (UID: \"d3b1abf1-3bce-4c0a-b2e3-652a58d08663\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:06.935746 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.935724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-k2zl5_c32688da-6a07-4fb0-a11d-64239ab022f4/node-ca/0.log" Apr 16 17:11:06.985310 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:06.985280 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:07.113503 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.113477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr"] Apr 16 17:11:07.116244 ip-10-0-130-1 kubenswrapper[2576]: W0416 17:11:07.116213 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3b1abf1_3bce_4c0a_b2e3_652a58d08663.slice/crio-1d8f84b8d5b10ebc181464e52154203443dd44d3bfb0d241615f9303bf530dcd WatchSource:0}: Error finding container 1d8f84b8d5b10ebc181464e52154203443dd44d3bfb0d241615f9303bf530dcd: Status 404 returned error can't find the container with id 1d8f84b8d5b10ebc181464e52154203443dd44d3bfb0d241615f9303bf530dcd Apr 16 17:11:07.117924 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.117905 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:11:07.958178 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.958136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" 
event={"ID":"d3b1abf1-3bce-4c0a-b2e3-652a58d08663","Type":"ContainerStarted","Data":"fe5616316116f9f135f3929f865d437df585d08c1b14fe8ee53bebdb2224e96c"} Apr 16 17:11:07.958178 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.958173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" event={"ID":"d3b1abf1-3bce-4c0a-b2e3-652a58d08663","Type":"ContainerStarted","Data":"1d8f84b8d5b10ebc181464e52154203443dd44d3bfb0d241615f9303bf530dcd"} Apr 16 17:11:07.958599 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.958315 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" Apr 16 17:11:07.973448 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.973397 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr" podStartSLOduration=1.973383374 podStartE2EDuration="1.973383374s" podCreationTimestamp="2026-04-16 17:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:11:07.972490697 +0000 UTC m=+1425.070369774" watchObservedRunningTime="2026-04-16 17:11:07.973383374 +0000 UTC m=+1425.071262449" Apr 16 17:11:07.992494 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:07.992465 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rjlc7_2c428e0a-19c4-4e80-ba25-9b1be39d973e/serve-healthcheck-canary/0.log" Apr 16 17:11:08.402373 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:08.402345 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sqgtm_372066d1-c5d3-423a-90bc-f98568ca34a8/kube-rbac-proxy/0.log" Apr 16 17:11:08.421966 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:08.421924 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-sqgtm_372066d1-c5d3-423a-90bc-f98568ca34a8/exporter/0.log"
Apr 16 17:11:08.441526 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:08.441496 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sqgtm_372066d1-c5d3-423a-90bc-f98568ca34a8/extractor/0.log"
Apr 16 17:11:10.509915 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:10.509885 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-2kchp_f7532813-1d19-4981-890a-56b49e813729/manager/0.log"
Apr 16 17:11:10.554475 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:10.554444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-vzt88_115ba5ea-a060-4e01-8199-61d6751e7468/seaweedfs/0.log"
Apr 16 17:11:13.969590 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:13.969561 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qlwwb_b3927737-e108-4def-a580-b80f2c3b48b6/migrator/0.log"
Apr 16 17:11:13.972147 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:13.972122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-z4wcr"
Apr 16 17:11:13.995443 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:13.995414 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qlwwb_b3927737-e108-4def-a580-b80f2c3b48b6/graceful-termination/0.log"
Apr 16 17:11:14.305413 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:14.305378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-tqhcx_c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe/kube-storage-version-migrator-operator/1.log"
Apr 16 17:11:14.306253 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:14.306227 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-tqhcx_c7ba38c2-f795-4fd6-9d38-17e6a32e0bbe/kube-storage-version-migrator-operator/0.log"
Apr 16 17:11:15.091090 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.091061 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/kube-multus-additional-cni-plugins/0.log"
Apr 16 17:11:15.113716 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.113689 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/egress-router-binary-copy/0.log"
Apr 16 17:11:15.134309 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.134285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/cni-plugins/0.log"
Apr 16 17:11:15.154459 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.154427 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/bond-cni-plugin/0.log"
Apr 16 17:11:15.174567 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.174534 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/routeoverride-cni/0.log"
Apr 16 17:11:15.194137 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.194099 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/whereabouts-cni-bincopy/0.log"
Apr 16 17:11:15.217120 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.217094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4l5qj_20a28d9b-77a3-42fd-bd13-72cb783d8673/whereabouts-cni/0.log"
Apr 16 17:11:15.601866 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.601834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xgfzz_9e2838a2-f0a3-4285-86fb-f54be274ccfa/kube-multus/0.log"
Apr 16 17:11:15.666643 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.666615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dc7qq_0f7cce27-fc9e-437d-9147-a82b82151b07/network-metrics-daemon/0.log"
Apr 16 17:11:15.686166 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:15.686136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dc7qq_0f7cce27-fc9e-437d-9147-a82b82151b07/kube-rbac-proxy/0.log"
Apr 16 17:11:16.817201 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.817172 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-controller/0.log"
Apr 16 17:11:16.834353 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.834281 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/0.log"
Apr 16 17:11:16.840373 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.840348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovn-acl-logging/1.log"
Apr 16 17:11:16.859471 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.859440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/kube-rbac-proxy-node/0.log"
Apr 16 17:11:16.880686 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.880653 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 17:11:16.900873 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.900845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/northd/0.log"
Apr 16 17:11:16.922151 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.922128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/nbdb/0.log"
Apr 16 17:11:16.943633 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:16.943609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/sbdb/0.log"
Apr 16 17:11:17.042069 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:17.042032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6dbws_a4267203-3686-4c79-a755-afbc3279763c/ovnkube-controller/0.log"
Apr 16 17:11:18.371318 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:18.371286 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b2qz6_3e57de2a-0cfa-4859-bb21-132d521252f0/network-check-target-container/0.log"
Apr 16 17:11:19.372553 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:19.372522 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9xzz6_06a3bd25-03ba-42cc-8d7f-5aac5e0fc674/iptables-alerter/0.log"
Apr 16 17:11:20.007643 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:20.007609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8fpkc_477dec0e-139a-477a-85c0-0229e6e2398d/tuned/0.log"
Apr 16 17:11:23.381694 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:23.381668 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7fxqg_810ea4ed-899e-4f5f-908f-30d9aff93364/csi-driver/0.log"
Apr 16 17:11:23.401310 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:23.401270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7fxqg_810ea4ed-899e-4f5f-908f-30d9aff93364/csi-node-driver-registrar/0.log"
Apr 16 17:11:23.420161 ip-10-0-130-1 kubenswrapper[2576]: I0416 17:11:23.420137 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-7fxqg_810ea4ed-899e-4f5f-908f-30d9aff93364/csi-liveness-probe/0.log"