Apr 17 20:41:20.587165 ip-10-0-139-255 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:41:20.587177 ip-10-0-139-255 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:41:20.587186 ip-10-0-139-255 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:41:20.587502 ip-10-0-139-255 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:41:30.609488 ip-10-0-139-255 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:41:30.609504 ip-10-0-139-255 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f39f5112b6f64b1cb486aadbb39c58a7 --
Apr 17 20:43:54.099745 ip-10-0-139-255 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:43:54.466468 ip-10-0-139-255 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:43:54.466468 ip-10-0-139-255 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:43:54.466468 ip-10-0-139-255 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:43:54.467042 ip-10-0-139-255 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:43:54.467042 ip-10-0-139-255 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:43:54.467926 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.467848 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:43:54.470078 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470063 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:43:54.470078 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470078 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470082 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470085 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470088 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470091 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470094 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470097 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470100 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470103 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470106 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470109 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470111 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470114 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470117 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470119 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470122 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470124 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470127 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470130 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470132 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:43:54.470139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470138 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470141 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470144 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470146 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470150 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470153 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470156 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470158 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470161 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470164 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470166 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470169 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470172 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470175 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470177 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470180 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470182 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470185 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470187 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:43:54.470635 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470190 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470193 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470195 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470197 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470200 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470202 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470205 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470208 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470212 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470216 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470219 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470222 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470224 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470227 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470230 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470233 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470237 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470241 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470244 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:43:54.471139 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470248 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470251 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470253 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470256 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470259 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470261 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470264 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470266 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470269 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470271 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470274 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470276 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470279 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470283 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470286 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470288 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470291 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470294 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470296 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470299 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:43:54.471624 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470302 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470304 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470307 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470309 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470312 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470315 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470320 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470715 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470722 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470725 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470728 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470731 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470733 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470736 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470739 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470741 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470744 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470746 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470749 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470751 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:43:54.472102 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470754 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470756 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470759 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470761 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470764 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470767 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470769 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470772 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470775 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470778 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470780 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470783 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470785 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470789 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470802 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470807 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470811 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470814 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470817 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:43:54.472597 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470820 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470823 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470827 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470829 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470832 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470834 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470837 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470840 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470843 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470845 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470848 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470850 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470853 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470855 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470858 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470860 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470863 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470866 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470868 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470871 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:43:54.473063 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470873 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470876 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470879 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470882 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470884 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470887 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470889 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470892 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470894 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470897 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470899 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470902 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470904 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470908 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470911 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470913 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470915 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470918 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470921 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470924 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:43:54.473572 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470926 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470928 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470931 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470934 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470936 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470939 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470941 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470944 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470946 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470951 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470953 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470956 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470958 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.470961 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471030 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471038 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471045 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471049 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471053 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471056 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471061 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:43:54.474059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471065 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471068 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471071 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471075 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471078 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471081 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471085 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471088 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471091 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471094 2567 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471096 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471099 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471104 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471106 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471110 2567 flags.go:64] FLAG: --config-dir=""
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471112 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471116 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471120 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471123 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471126 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471130 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471133 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471136 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471140 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471143 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:43:54.474593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471146 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471151 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471154 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471158 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471161 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471164 2567 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471167 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471171 2567 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471175 2567 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471177 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471180 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471183 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471187 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471190 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471193 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471196 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471199 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471202 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471205 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471208 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471211 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471213 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471216 2567 flags.go:64] FLAG: --feature-gates=""
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471220 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471223 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 20:43:54.475211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471226 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471229 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471232 2567 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471235 2567 flags.go:64] FLAG: --help="false"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471238 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471241 2567 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471244 2567 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471247 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471251 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471254 2567 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471257 2567 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471260 2567 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471263 2567 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471265 2567 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471268 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471271 2567 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471274 2567 flags.go:64] FLAG: --kube-reserved=""
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471277 2567 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471280 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471283 2567 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471286 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471289 2567 flags.go:64] FLAG: --lock-file=""
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471292 2567 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471295 2567 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 20:43:54.475900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471298 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471303 2567 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471306 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471308 2567 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471311 2567 flags.go:64] FLAG: --logging-format="text"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471314 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471317 2567 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471320 2567 flags.go:64] FLAG: --manifest-url=""
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471323 2567 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471327 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471330 2567 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471334 2567 flags.go:64] FLAG: --max-pods="110"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471337 2567 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471340 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471343 2567 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471346 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471349 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471352 2567 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471355 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471364 2567 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471367 2567 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471370 2567 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471373 2567 flags.go:64] FLAG: --pod-cidr=""
Apr 17 20:43:54.476534 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471376 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471382 2567 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471385 2567 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471388 2567 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471391 2567 flags.go:64] FLAG: --port="10250"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471395 2567 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471397 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-056f7ef53f9d4f25e"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471401 2567 flags.go:64] FLAG: --qos-reserved=""
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471404 2567 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471407 2567 flags.go:64] FLAG: --register-node="true"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471410 2567 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471412 2567 flags.go:64] FLAG: --register-with-taints=""
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471416 2567 flags.go:64] FLAG: --registry-burst="10"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471419 2567 flags.go:64] FLAG: --registry-qps="5"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471422 2567 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471424 2567 flags.go:64] FLAG: --reserved-memory=""
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471428 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471431 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471434 2567 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471438 2567 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471441 2567 flags.go:64] FLAG: --runonce="false"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471443 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471459 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471462 2567 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471466 2567 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471469 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 20:43:54.477078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471472 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471475 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471478 2567 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471481 2567 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471484 2567 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471487 2567 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471490 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471493 2567 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471496 2567 flags.go:64] FLAG: --system-cgroups=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471499 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471505 2567 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471507 2567 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471510 2567 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471514 2567 flags.go:64] FLAG: --tls-min-version=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471517 2567 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471520 2567 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471523 2567 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471525 2567 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471528 2567 flags.go:64] FLAG: --v="2"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471533 2567 flags.go:64] FLAG: --version="false"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471537 2567 flags.go:64] FLAG: --vmodule=""
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471541 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.471544 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471628 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471632 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:43:54.477753 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471636 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471639 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471642 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471645 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471648 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471650 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471653 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471658 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471661 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471664 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471667 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471674 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471677 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471680 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471682 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471685 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471688 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471690 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471693 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:43:54.478372 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471695 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471699 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471703 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471706 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471708 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471711 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471713 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471716 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471718 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471721 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471723 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471728 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471731 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471733 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471736 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471738 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471741 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471744 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471746 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471749 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:43:54.478883 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471751 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471754 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471756 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471759 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471762 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471764 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471767 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471769 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471772 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471774 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471777 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471779 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471782 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471784 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471787 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471789 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471795 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471797 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471800 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471803 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:43:54.479386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471805 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471807 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471810 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471814 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471817 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471819 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471822 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471825 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471827 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471830 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471832 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471835 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471837 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471840 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471842 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471845 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471847 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471850 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471860 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471863 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:43:54.479919 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471866 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471868 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471871 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471873 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.471876 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.472472 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.478603 2567
server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.478619 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478666 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478671 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478674 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478677 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478680 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478683 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478686 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:43:54.480415 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478689 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478691 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478694 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478697 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478700 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478702 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478705 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478707 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478710 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478714 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
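Once parsing finishes, the effective map is printed in a single feature_gate.go:384 record ("feature gates: {map[...]}"), which is the one line worth extracting when checking what a node is actually running with. A small reader-side helper, assuming only the map[Name:bool ...] format shown in this log:

    package main

    import (
        "fmt"
        "regexp"
        "strings"
    )

    // Matches the "feature gates: {map[...]}" payload of a journal line.
    var mapRe = regexp.MustCompile(`feature gates: \{map\[([^\]]*)\]\}`)

    // parseGates returns the effective gate map from one such line,
    // or nil if the line is not a feature-gates summary record.
    func parseGates(line string) map[string]bool {
        m := mapRe.FindStringSubmatch(line)
        if m == nil {
            return nil
        }
        gates := map[string]bool{}
        for _, pair := range strings.Fields(m[1]) {
            if name, val, ok := strings.Cut(pair, ":"); ok {
                gates[name] = val == "true"
            }
        }
        return gates
    }

    func main() {
        line := `I0417 20:43:54.472472 2567 feature_gate.go:384] feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}`
        fmt.Println(parseGates(line)) // map[ImageVolume:true KMSv1:true NodeSwap:false]
    }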
Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478718 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478721 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478724 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478727 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478730 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478733 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478736 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478738 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478741 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478744 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:43:54.480814 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478746 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478749 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478752 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478755 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478760 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478762 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478765 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478768 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478770 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478773 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478775 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478778 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478780 2567 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478783 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478785 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478788 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478791 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478801 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478804 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478807 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:43:54.481304 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478810 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478812 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478815 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478818 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478820 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478823 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478825 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478828 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478830 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478833 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478836 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478838 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478840 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478844 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478847 2567 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478850 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478854 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478858 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478861 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:43:54.481813 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478864 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478867 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478869 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478872 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478874 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478877 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478879 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478882 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478884 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478887 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478890 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478892 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478895 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478898 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478901 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478903 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478906 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478908 2567 feature_gate.go:328] unrecognized 
feature gate: ImageStreamImportMode Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478911 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:43:54.482275 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.478913 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.478919 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479035 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479040 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479044 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479047 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479050 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479053 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479055 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479059 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
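Feature gates, like reservations such as the systemReserved values visible in the nodeConfig dump further down, can be carried in a KubeletConfiguration file rather than on the command line. A sketch that emits a minimal kubelet.config.k8s.io/v1beta1 document; the field selection and gate values are illustrative, not a complete node configuration:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Only the fields used below; the real KubeletConfiguration
    // type has many more.
    type KubeletConfiguration struct {
        APIVersion     string            `json:"apiVersion"`
        Kind           string            `json:"kind"`
        FeatureGates   map[string]bool   `json:"featureGates,omitempty"`
        SystemReserved map[string]string `json:"systemReserved,omitempty"`
    }

    func main() {
        cfg := KubeletConfiguration{
            APIVersion:   "kubelet.config.k8s.io/v1beta1",
            Kind:         "KubeletConfiguration",
            FeatureGates: map[string]bool{"KMSv1": true, "NodeSwap": false},
            // Matches the SystemReserved block in the nodeConfig record below.
            SystemReserved: map[string]string{
                "cpu": "500m", "memory": "1Gi", "ephemeral-storage": "1Gi",
            },
        }
        out, _ := json.MarshalIndent(cfg, "", "  ")
        fmt.Println(string(out))
    }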
Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479063 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479066 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479069 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479072 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479075 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:43:54.482757 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479077 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479080 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479082 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479085 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479087 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479089 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479092 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479094 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479097 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479099 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479102 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479105 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479107 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479110 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479113 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479115 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479118 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479120 2567 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479123 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479125 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:43:54.483144 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479128 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479130 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479133 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479135 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479138 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479141 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479143 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479146 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479148 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479150 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479154 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479156 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479159 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479161 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479164 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479166 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479169 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479171 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479174 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:43:54.483645 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479176 2567 feature_gate.go:328] 
unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479179 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479181 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479183 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479186 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479189 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479192 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479194 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479197 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479199 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479202 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479204 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479207 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479209 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479212 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479215 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479217 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479220 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479222 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:43:54.484107 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479225 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479228 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
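The same gate list is dumped three times within a few milliseconds here (process timestamps 20:43:54.4716xx, .4787xx, .4790xx), which reads as the one configured map being applied to more than one feature-gate instance inside the process. When triaging a flood like this, it is easier to collapse it to one count per gate; a throwaway filter (reader-side tooling, not kubelet code) for journalctl -u kubelet output, one record per line:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "sort"
        "strings"
    )

    func main() {
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
        for sc.Scan() {
            if _, after, ok := strings.Cut(sc.Text(), "unrecognized feature gate: "); ok {
                if f := strings.Fields(after); len(f) > 0 {
                    counts[f[0]]++ // gate name is the first token after the marker
                }
            }
        }
        names := make([]string, 0, len(counts))
        for n := range counts {
            names = append(names, n)
        }
        sort.Strings(names)
        for _, n := range names {
            fmt.Printf("%5d %s\n", counts[n], n)
        }
    }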
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479231 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479234 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479236 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479239 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479242 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479244 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479247 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479249 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479252 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479254 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479257 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479259 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:54.479261 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.479266 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:43:54.484623 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.479965 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 20:43:54.485019 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.484229 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 20:43:54.485150 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.485138 2567 server.go:1019] "Starting client certificate rotation"
Apr 17 20:43:54.485255 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.485239 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:43:54.485310 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.485292 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:43:54.506539 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.506519 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:43:54.511423 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.511404 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:43:54.524599 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.524583 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 17 20:43:54.529995 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.529978 2567 log.go:25] "Validated CRI v1 image API"
Apr 17 20:43:54.531278 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.531257 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 20:43:54.534745 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.534724 2567 fs.go:135] Filesystem UUIDs: map[174afb8d-23be-419d-a45d-96221438f0dc:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fee455f0-30ee-49d8-9a71-6ae0b49333c7:/dev/nvme0n1p3]
Apr 17 20:43:54.534818 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.534744 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 20:43:54.535604 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.535586 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:43:54.539536 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.539411 2567 manager.go:217] Machine: {Timestamp:2026-04-17 20:43:54.538350903 +0000 UTC m=+0.337229060 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100884 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c45d38d983cdc09dabf0998f0d420 SystemUUID:ec2c45d3-8d98-3cdc-09da-bf0998f0d420 BootID:f39f5112-b6f6-4b1c-b486-aadbb39c58a7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9d:f2:55:d1:99 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9d:f2:55:d1:99 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:2f:1b:b3:22:5a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 20:43:54.539536 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.539531 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 20:43:54.539644 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.539633 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 20:43:54.540653 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.540630 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 20:43:54.540795 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.540655 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-255.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 20:43:54.540843 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.540805 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 20:43:54.540843 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.540813 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 20:43:54.540843 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.540829 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:43:54.540922 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.540846 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:43:54.542258 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.542248 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:43:54.542364 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.542356 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 20:43:54.544252 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.544243 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 20:43:54.544293 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.544256 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 20:43:54.544848 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.544839 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 20:43:54.544881 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.544851 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 17 20:43:54.544881 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.544859 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 20:43:54.545796 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.545785 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:43:54.545841 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.545803 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:43:54.548285 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.548262 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 20:43:54.550206 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.550193 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 20:43:54.551412 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551400 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551418 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551425 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551430 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551436 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551442 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551467 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 20:43:54.551477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551475 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 20:43:54.551700 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551482 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 20:43:54.551700 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551488 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 20:43:54.551700 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551508 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 20:43:54.551700 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.551516 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 20:43:54.553593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.553547 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 20:43:54.553805 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.553792 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 20:43:54.558211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.558197 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 20:43:54.558297 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.558285 2567 server.go:1295] "Started kubelet"
Apr 17 20:43:54.558461 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.558354 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 20:43:54.558919 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.558874 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 20:43:54.559003 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.558942 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 20:43:54.559392 ip-10-0-139-255 systemd[1]: Started Kubernetes Kubelet.
Apr 17 20:43:54.561576 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.561554 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:43:54.561688 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.561652 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:43:54.561765 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.561715 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:43:54.561812 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.561770 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-255.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:43:54.562305 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.562291 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:43:54.567199 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.567180 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:43:54.567520 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.567503 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:43:54.567601 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.567523 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:43:54.568023 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.566844 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-255.ec2.internal.18a73fb74f047c75 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-255.ec2.internal,UID:ip-10-0-139-255.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-255.ec2.internal,},FirstTimestamp:2026-04-17 20:43:54.558209141 +0000 UTC m=+0.357087298,LastTimestamp:2026-04-17 20:43:54.558209141 +0000 UTC m=+0.357087298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-255.ec2.internal,}" Apr 17 20:43:54.568118 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568104 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:43:54.568118 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568115 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:43:54.568218 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568130 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:43:54.568277 
ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568261 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:43:54.568277 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568269 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:43:54.568277 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.568258 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found" Apr 17 20:43:54.568396 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568388 2567 factory.go:55] Registering systemd factory Apr 17 20:43:54.568463 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568431 2567 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:43:54.568694 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568681 2567 factory.go:153] Registering CRI-O factory Apr 17 20:43:54.568742 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568696 2567 factory.go:223] Registration of the crio container factory successfully Apr 17 20:43:54.568774 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568749 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:43:54.568822 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568780 2567 factory.go:103] Registering Raw factory Apr 17 20:43:54.568822 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.568796 2567 manager.go:1196] Started watching for new ooms in manager Apr 17 20:43:54.569281 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.569269 2567 manager.go:319] Starting recovery of all containers Apr 17 20:43:54.573740 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.573691 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 20:43:54.573849 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.573824 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 20:43:54.577657 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.577636 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8lzgx" Apr 17 20:43:54.579050 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.579035 2567 manager.go:324] Recovery completed Apr 17 20:43:54.580622 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.580605 2567 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 20:43:54.584802 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.584788 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:43:54.584882 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.584853 2567 csr.go:270] "Certificate signing request is 
issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8lzgx" Apr 17 20:43:54.590545 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.590529 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:43:54.590632 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.590562 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:43:54.590632 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.590576 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:43:54.591104 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.591091 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:43:54.591152 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.591104 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:43:54.591152 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.591120 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:43:54.592588 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.592526 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-255.ec2.internal.18a73fb750f1e69d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-255.ec2.internal,UID:ip-10-0-139-255.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-255.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-255.ec2.internal,},FirstTimestamp:2026-04-17 20:43:54.590545565 +0000 UTC m=+0.389423728,LastTimestamp:2026-04-17 20:43:54.590545565 +0000 UTC m=+0.389423728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-255.ec2.internal,}" Apr 17 20:43:54.594593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.594580 2567 policy_none.go:49] "None policy: Start" Apr 17 20:43:54.594644 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.594597 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:43:54.594644 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.594607 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:43:54.632111 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.631986 2567 manager.go:341] "Starting Device Plugin manager" Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.632147 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.632163 2567 server.go:85] "Starting device plugin registration server" Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.632348 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.632359 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: I0417 
Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.632554 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.632564 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.633123 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 20:43:54.651638 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.633157 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:54.706049 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.706023 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 20:43:54.707248 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.707230 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 20:43:54.707318 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.707256 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 20:43:54.707318 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.707274 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 20:43:54.707318 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.707281 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 20:43:54.707481 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.707349 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 20:43:54.710734 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.710713 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:43:54.732895 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.732841 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:43:54.733727 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.733714 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:43:54.733801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.733743 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:43:54.733801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.733755 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:43:54.733801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.733789 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.742389 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.742376 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.742431 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.742395 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-255.ec2.internal\": node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:54.753965 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.753946 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:54.807896 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.807866 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"]
Apr 17 20:43:54.807986 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.807944 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:43:54.809599 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.809576 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:43:54.809694 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.809608 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:43:54.809694 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.809621 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:43:54.811963 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.811949 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:43:54.812109 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812094 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.812169 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812128 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:43:54.812671 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812655 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:43:54.812744 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812681 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:43:54.812744 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812694 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:43:54.812744 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812717 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:43:54.812744 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812739 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:43:54.812922 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.812749 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:43:54.814983 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.814968 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.815059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.814998 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:43:54.815643 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.815619 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:43:54.815735 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.815649 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:43:54.815735 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.815667 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:43:54.839993 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.839973 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-255.ec2.internal\" not found" node="ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.844180 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.844165 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-255.ec2.internal\" not found" node="ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.854782 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.854762 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:54.870024 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.870006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4a540b23d926990aa9314e724ab32d20-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal\" (UID: \"4a540b23d926990aa9314e724ab32d20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.870072 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.870029 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a540b23d926990aa9314e724ab32d20-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal\" (UID: \"4a540b23d926990aa9314e724ab32d20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.870072 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.870049 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e250693f0814c5dff374e113e490f4a6-config\") pod \"kube-apiserver-proxy-ip-10-0-139-255.ec2.internal\" (UID: \"e250693f0814c5dff374e113e490f4a6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.955840 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:54.955813 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:54.970163 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.970147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4a540b23d926990aa9314e724ab32d20-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal\" (UID: \"4a540b23d926990aa9314e724ab32d20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.970210 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.970158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4a540b23d926990aa9314e724ab32d20-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal\" (UID: \"4a540b23d926990aa9314e724ab32d20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.970210 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.970181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a540b23d926990aa9314e724ab32d20-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal\" (UID: \"4a540b23d926990aa9314e724ab32d20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.970210 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.970206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e250693f0814c5dff374e113e490f4a6-config\") pod \"kube-apiserver-proxy-ip-10-0-139-255.ec2.internal\" (UID: \"e250693f0814c5dff374e113e490f4a6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.970298 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.970241 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e250693f0814c5dff374e113e490f4a6-config\") pod \"kube-apiserver-proxy-ip-10-0-139-255.ec2.internal\" (UID: \"e250693f0814c5dff374e113e490f4a6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:54.970298 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:54.970274 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a540b23d926990aa9314e724ab32d20-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal\" (UID: \"4a540b23d926990aa9314e724ab32d20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:55.056577 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.056524 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:55.142034 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.142007 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:55.146575 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.146560 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:55.157136 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.157118 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:55.258204 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.258173 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:55.358708 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.358651 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-255.ec2.internal\" not found"
Apr 17 20:43:55.399640 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.399615 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:43:55.468527 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.468499 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:55.481699 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.481676 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:43:55.483369 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.483352 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:55.484998 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.484984 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 20:43:55.485107 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.485092 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:43:55.485155 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.485141 2567 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://aa3d4f826af064136b9c4bb5337cfaf8-a595c5d6c5780cb6.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.139.255:54698->52.6.139.9:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:43:55.545955 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.545933 2567 apiserver.go:52] "Watching apiserver"
Apr 17 20:43:55.551293 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.551274 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 20:43:55.551601 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.551583 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-q4xbt","openshift-image-registry/node-ca-fmmlt","openshift-multus/multus-2bn4l","openshift-multus/multus-additional-cni-plugins-tz6kr","openshift-multus/network-metrics-daemon-mxwcv","openshift-network-diagnostics/network-check-target-mpmw8","openshift-network-operator/iptables-alerter-knzfb","openshift-ovn-kubernetes/ovnkube-node-fdrzh","kube-system/konnectivity-agent-rjk22","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp","openshift-dns/node-resolver-bcf4q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"]
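The csr-8lzgx lines earlier and csr-njxhk lines below show the bootstrap certificate flow: a CSR is created, approved, then issued; "Certificate rotation detected" then tears down existing client connections so they reconnect with the new credentials, which is why the mirror-pod POST above fails once with "use of closed network connection" and is retried. A hedged client-go sketch (kubeconfig path assumed) that reproduces the approved/issued distinction the csr.go messages report:

```go
// csr_state.go - list CertificateSigningRequests and their state (sketch).
package main

import (
	"context"
	"fmt"
	"log"

	certv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig") // assumed path
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, csr := range csrs.Items {
		approved := false
		for _, c := range csr.Status.Conditions {
			if c.Type == certv1.CertificateApproved {
				approved = true
			}
		}
		// "approved, waiting to be issued" == Approved condition present
		// but Status.Certificate still empty.
		issued := len(csr.Status.Certificate) > 0
		fmt.Printf("%s signer=%s approved=%v issued=%v\n", csr.Name, csr.Spec.SignerName, approved, issued)
	}
}
```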
pods=["openshift-cluster-node-tuning-operator/tuned-q4xbt","openshift-image-registry/node-ca-fmmlt","openshift-multus/multus-2bn4l","openshift-multus/multus-additional-cni-plugins-tz6kr","openshift-multus/network-metrics-daemon-mxwcv","openshift-network-diagnostics/network-check-target-mpmw8","openshift-network-operator/iptables-alerter-knzfb","openshift-ovn-kubernetes/ovnkube-node-fdrzh","kube-system/konnectivity-agent-rjk22","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp","openshift-dns/node-resolver-bcf4q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal"] Apr 17 20:43:55.556543 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.556524 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.558161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.558142 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.558248 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.558142 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.558248 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.558205 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kbs4v\"" Apr 17 20:43:55.558597 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.558582 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fmmlt" Apr 17 20:43:55.560196 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.560179 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.560295 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.560279 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s5rtq\"" Apr 17 20:43:55.560295 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.560288 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.560413 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.560290 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:43:55.560748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.560732 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.560976 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.560838 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.562299 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562285 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4bp6l\"" Apr 17 20:43:55.562380 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562367 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:43:55.562741 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562723 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:43:55.562741 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562738 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:43:55.562887 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562771 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:43:55.562887 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562740 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8dbbt\"" Apr 17 20:43:55.562887 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562791 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.562887 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.562794 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.565164 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.565150 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:55.565238 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.565210 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:43:55.567330 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.567311 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-knzfb" Apr 17 20:43:55.567446 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.567429 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.567593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.567568 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 20:43:55.569148 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569129 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:43:55.569230 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569214 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:43:55.569280 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569231 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.569467 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569437 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.569617 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569531 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.569617 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569539 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.569617 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569544 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7zmnk\"" Apr 17 20:43:55.569754 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bcf4q" Apr 17 20:43:55.569810 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.569795 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xdkzp\"" Apr 17 20:43:55.571144 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.571131 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.571323 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.571311 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.571403 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.571391 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-694gh\"" Apr 17 20:43:55.571964 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.571952 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:43:55.572014 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.572000 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:43:55.574154 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574130 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2tlt\" (UniqueName: \"kubernetes.io/projected/47702967-5f03-40a5-b1ae-9f6930a86290-kube-api-access-l2tlt\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb" Apr 17 20:43:55.574201 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574162 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-run\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574201 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574176 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ae2eb0-2562-4952-a3e2-66786045ebd7-host\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt" Apr 17 20:43:55.574201 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574190 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4rl\" (UniqueName: \"kubernetes.io/projected/71ae2eb0-2562-4952-a3e2-66786045ebd7-kube-api-access-zr4rl\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt" Apr 17 20:43:55.574336 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574212 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.574336 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574234 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-cni-multus\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.574336 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574256 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.574336 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574308 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.574336 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-socket-dir-parent\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574354 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-sys\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysctl-d\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysctl-conf\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574432 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfeb0b83-e72b-4e99-a672-fe8226b4c276-tmp\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-socket-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-k8s-cni-cncf-io\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574518 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-kubernetes\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574548 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6bv4\" (UniqueName: \"kubernetes.io/projected/bfeb0b83-e72b-4e99-a672-fe8226b4c276-kube-api-access-b6bv4\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-cnibin\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574593 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-kubelet\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574657 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-device-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-sys-fs\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574710 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/71ae2eb0-2562-4952-a3e2-66786045ebd7-serviceca\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-system-cni-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574769 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/47702967-5f03-40a5-b1ae-9f6930a86290-iptables-alerter-script\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-modprobe-d\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574874 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-cni-binary-copy\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.574916 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574909 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-var-lib-kubelet\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574943 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574974 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-host\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.574996 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0ad25b90-ed3e-4976-b701-b30fbe6881cd-hosts-file\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575024 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-lib-modules\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575037 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-tuned\") pod \"tuned-q4xbt\" (UID: 
\"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gth9k\" (UniqueName: \"kubernetes.io/projected/b84b134c-9465-48d2-b811-36203ae88de2-kube-api-access-gth9k\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47702967-5f03-40a5-b1ae-9f6930a86290-host-slash\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575094 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-etc-selinux\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575108 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-os-release\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575136 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hlsr\" (UniqueName: \"kubernetes.io/projected/2aad12b0-2520-4cf5-bc30-a332be05db03-kube-api-access-8hlsr\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575167 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-registration-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575197 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-os-release\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575226 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-netns\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " 
pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-conf-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575285 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-daemon-config\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ad25b90-ed3e-4976-b701-b30fbe6881cd-tmp-dir\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95bd04a6-fb3b-498b-bf3e-7b047bad740d-cni-binary-copy\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575366 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-multus-certs\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575387 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-cni-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-hostroot\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lqpw\" (UniqueName: \"kubernetes.io/projected/0ad25b90-ed3e-4976-b701-b30fbe6881cd-kube-api-access-9lqpw\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575442 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vdrs\" (UniqueName: 
\"kubernetes.io/projected/ecf62ce9-60d6-401a-a38e-898d286d58d4-kube-api-access-2vdrs\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575478 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-systemd\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk5sc\" (UniqueName: \"kubernetes.io/projected/95bd04a6-fb3b-498b-bf3e-7b047bad740d-kube-api-access-tk5sc\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-system-cni-dir\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575534 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-cnibin\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysconfig\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.575748 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.575616 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-cni-bin\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.577056 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-etc-kubernetes\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.577142 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577088 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: 
\"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:55.577204 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577179 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rjk22" Apr 17 20:43:55.577256 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577236 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:43:55.577306 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577258 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rgfn6\"" Apr 17 20:43:55.577358 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577318 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:43:55.577867 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577631 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:43:55.577867 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.577750 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:43:55.579761 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.578530 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:43:55.579761 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.578608 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:43:55.579761 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.578924 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:43:55.579761 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.579343 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-spbq6\"" Apr 17 20:43:55.581148 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.581123 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:43:55.581224 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.581130 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:43:55.586265 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.586240 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:38:54 +0000 UTC" deadline="2028-01-09 21:15:55.33470543 +0000 UTC" Apr 17 20:43:55.586265 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.586262 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15168h31m59.748445447s" Apr 17 20:43:55.604124 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.604106 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-njxhk" Apr 17 20:43:55.612663 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.612605 2567 csr.go:270] 
"Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-njxhk" Apr 17 20:43:55.669038 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.669017 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:43:55.677420 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677400 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovnkube-script-lib\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.677513 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-modprobe-d\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.677513 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677461 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-cni-binary-copy\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.677600 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-modprobe-d\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.677600 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677568 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-systemd\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.677662 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.677662 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-var-lib-kubelet\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.677721 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-var-lib-kubelet\") pod \"tuned-q4xbt\" (UID: 
\"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.677721 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677663 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.677787 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.677787 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677736 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-log-socket\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.677787 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-host\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.677926 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677798 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0ad25b90-ed3e-4976-b701-b30fbe6881cd-hosts-file\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q" Apr 17 20:43:55.677926 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-lib-modules\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.677926 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677838 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-tuned\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.678068 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677926 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-lib-modules\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.678068 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.677968 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-host\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.678068 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678009 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0ad25b90-ed3e-4976-b701-b30fbe6881cd-hosts-file\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q" Apr 17 20:43:55.678200 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gth9k\" (UniqueName: \"kubernetes.io/projected/b84b134c-9465-48d2-b811-36203ae88de2-kube-api-access-gth9k\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:55.678200 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47702967-5f03-40a5-b1ae-9f6930a86290-host-slash\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb" Apr 17 20:43:55.678200 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-etc-selinux\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" Apr 17 20:43:55.678200 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1f54a916-b3e7-4361-b0c6-0ec7db5c31e6-konnectivity-ca\") pod \"konnectivity-agent-rjk22\" (UID: \"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6\") " pod="kube-system/konnectivity-agent-rjk22" Apr 17 20:43:55.678200 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-os-release\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hlsr\" (UniqueName: \"kubernetes.io/projected/2aad12b0-2520-4cf5-bc30-a332be05db03-kube-api-access-8hlsr\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-registration-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" 
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678237 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678261 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-var-lib-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-etc-selinux\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678335 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47702967-5f03-40a5-b1ae-9f6930a86290-host-slash\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678379 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-os-release\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678384 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-os-release\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678418 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678423 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-netns\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-os-release\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678465 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-conf-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-cni-binary-copy\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678492 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-netns\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-daemon-config\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-conf-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ad25b90-ed3e-4976-b701-b30fbe6881cd-tmp-dir\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-cni-netd\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovnkube-config\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678640 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-env-overrides\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95bd04a6-fb3b-498b-bf3e-7b047bad740d-cni-binary-copy\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678723 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-multus-certs\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678746 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-etc-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-cni-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678802 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-hostroot\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lqpw\" (UniqueName: \"kubernetes.io/projected/0ad25b90-ed3e-4976-b701-b30fbe6881cd-kube-api-access-9lqpw\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q"
Apr 17 20:43:55.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vdrs\" (UniqueName: \"kubernetes.io/projected/ecf62ce9-60d6-401a-a38e-898d286d58d4-kube-api-access-2vdrs\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678921 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-systemd-units\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-systemd\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678972 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-registration-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk5sc\" (UniqueName: \"kubernetes.io/projected/95bd04a6-fb3b-498b-bf3e-7b047bad740d-kube-api-access-tk5sc\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.678999 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ad25b90-ed3e-4976-b701-b30fbe6881cd-tmp-dir\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-system-cni-dir\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-hostroot\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-cnibin\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-kubelet\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679097 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-multus-certs\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-cni-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679131 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-run-netns\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679151 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-systemd\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679158 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-ovn\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-system-cni-dir\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-daemon-config\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.679707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysconfig\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-cnibin\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-cni-bin\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679268 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysconfig\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-etc-kubernetes\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679298 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95bd04a6-fb3b-498b-bf3e-7b047bad740d-cni-binary-copy\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-etc-kubernetes\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679383 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-cni-bin\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2tlt\" (UniqueName: \"kubernetes.io/projected/47702967-5f03-40a5-b1ae-9f6930a86290-kube-api-access-l2tlt\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.679621 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.679990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjs74\" (UniqueName: \"kubernetes.io/projected/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-kube-api-access-vjs74\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-run\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ae2eb0-2562-4952-a3e2-66786045ebd7-host\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4rl\" (UniqueName: \"kubernetes.io/projected/71ae2eb0-2562-4952-a3e2-66786045ebd7-kube-api-access-zr4rl\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.680430 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-run\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ae2eb0-2562-4952-a3e2-66786045ebd7-host\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.680665 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:43:56.180630477 +0000 UTC m=+1.979508629 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680742 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1f54a916-b3e7-4361-b0c6-0ec7db5c31e6-agent-certs\") pod \"konnectivity-agent-rjk22\" (UID: \"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6\") " pod="kube-system/konnectivity-agent-rjk22"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-cni-multus\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680845 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-cni-multus\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680896 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680931 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-node-log\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-cni-bin\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.680987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovn-node-metrics-cert\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-socket-dir-parent\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-sys\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysctl-d\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysctl-conf\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681165 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfeb0b83-e72b-4e99-a672-fe8226b4c276-tmp\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681249 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-socket-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681288 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-tuned\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681311 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681323 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-sys\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysctl-conf\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-multus-socket-dir-parent\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681549 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-sysctl-d\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-k8s-cni-cncf-io\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681611 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-slash\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-socket-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-kubernetes\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681709 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6bv4\" (UniqueName: \"kubernetes.io/projected/bfeb0b83-e72b-4e99-a672-fe8226b4c276-kube-api-access-b6bv4\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfeb0b83-e72b-4e99-a672-fe8226b4c276-etc-kubernetes\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681773 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-run-k8s-cni-cncf-io\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.681789 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-cnibin\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-kubelet\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-cnibin\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.681861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-device-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682112 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-sys-fs\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/71ae2eb0-2562-4952-a3e2-66786045ebd7-serviceca\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-system-cni-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/47702967-5f03-40a5-b1ae-9f6930a86290-iptables-alerter-script\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb"
Apr 17 20:43:55.682477 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aad12b0-2520-4cf5-bc30-a332be05db03-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.682870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682489 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-sys-fs\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.682870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682507 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-host-var-lib-kubelet\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.682870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ecf62ce9-60d6-401a-a38e-898d286d58d4-device-dir\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.682870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682631 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95bd04a6-fb3b-498b-bf3e-7b047bad740d-system-cni-dir\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.682870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aad12b0-2520-4cf5-bc30-a332be05db03-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.682870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682820 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/47702967-5f03-40a5-b1ae-9f6930a86290-iptables-alerter-script\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb"
Apr 17 20:43:55.683127 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.682929 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/71ae2eb0-2562-4952-a3e2-66786045ebd7-serviceca\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt"
Apr 17 20:43:55.684113 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.683767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfeb0b83-e72b-4e99-a672-fe8226b4c276-tmp\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.687785 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.687487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2tlt\" (UniqueName: \"kubernetes.io/projected/47702967-5f03-40a5-b1ae-9f6930a86290-kube-api-access-l2tlt\") pod \"iptables-alerter-knzfb\" (UID: \"47702967-5f03-40a5-b1ae-9f6930a86290\") " pod="openshift-network-operator/iptables-alerter-knzfb"
Apr 17 20:43:55.687785 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.687578 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hlsr\" (UniqueName: \"kubernetes.io/projected/2aad12b0-2520-4cf5-bc30-a332be05db03-kube-api-access-8hlsr\") pod \"multus-additional-cni-plugins-tz6kr\" (UID: \"2aad12b0-2520-4cf5-bc30-a332be05db03\") " pod="openshift-multus/multus-additional-cni-plugins-tz6kr"
Apr 17 20:43:55.687785 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.687686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vdrs\" (UniqueName: \"kubernetes.io/projected/ecf62ce9-60d6-401a-a38e-898d286d58d4-kube-api-access-2vdrs\") pod \"aws-ebs-csi-driver-node-vwgxp\" (UID: \"ecf62ce9-60d6-401a-a38e-898d286d58d4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.687785 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.687730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gth9k\" (UniqueName: \"kubernetes.io/projected/b84b134c-9465-48d2-b811-36203ae88de2-kube-api-access-gth9k\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:43:55.687955 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.687792 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lqpw\" (UniqueName: \"kubernetes.io/projected/0ad25b90-ed3e-4976-b701-b30fbe6881cd-kube-api-access-9lqpw\") pod \"node-resolver-bcf4q\" (UID: \"0ad25b90-ed3e-4976-b701-b30fbe6881cd\") " pod="openshift-dns/node-resolver-bcf4q"
Apr 17 20:43:55.689958 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.689940 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4rl\" (UniqueName: \"kubernetes.io/projected/71ae2eb0-2562-4952-a3e2-66786045ebd7-kube-api-access-zr4rl\") pod \"node-ca-fmmlt\" (UID: \"71ae2eb0-2562-4952-a3e2-66786045ebd7\") " pod="openshift-image-registry/node-ca-fmmlt"
Apr 17 20:43:55.690078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.690062 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk5sc\" (UniqueName: \"kubernetes.io/projected/95bd04a6-fb3b-498b-bf3e-7b047bad740d-kube-api-access-tk5sc\") pod \"multus-2bn4l\" (UID: \"95bd04a6-fb3b-498b-bf3e-7b047bad740d\") " pod="openshift-multus/multus-2bn4l"
Apr 17 20:43:55.690530 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.690505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6bv4\" (UniqueName: \"kubernetes.io/projected/bfeb0b83-e72b-4e99-a672-fe8226b4c276-kube-api-access-b6bv4\") pod \"tuned-q4xbt\" (UID: \"bfeb0b83-e72b-4e99-a672-fe8226b4c276\") " pod="openshift-cluster-node-tuning-operator/tuned-q4xbt"
Apr 17 20:43:55.694438 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.694422 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp"
Apr 17 20:43:55.701901 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.701886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bcf4q"
Apr 17 20:43:55.780613 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.780568 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a540b23d926990aa9314e724ab32d20.slice/crio-65bb01071f0179c776022a4231905f701b6a523b2fffd990ad1b6c041719b67b WatchSource:0}: Error finding container 65bb01071f0179c776022a4231905f701b6a523b2fffd990ad1b6c041719b67b: Status 404 returned error can't find the container with id 65bb01071f0179c776022a4231905f701b6a523b2fffd990ad1b6c041719b67b
Apr 17 20:43:55.781879 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.781853 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode250693f0814c5dff374e113e490f4a6.slice/crio-d7346b4ccc517bccdb8e43add2cf5f1dcd778e145126d5c52daa0779ab74a487 WatchSource:0}: Error finding container d7346b4ccc517bccdb8e43add2cf5f1dcd778e145126d5c52daa0779ab74a487: Status 404 returned error can't find the container with id d7346b4ccc517bccdb8e43add2cf5f1dcd778e145126d5c52daa0779ab74a487
Apr 17 20:43:55.783121 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-cni-netd\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783121 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783064 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovnkube-config\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783121 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783087 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-env-overrides\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783267 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783126 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-etc-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783267 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783150 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-cni-netd\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783267 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-systemd-units\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783267 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-etc-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783267 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783242 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-systemd-units\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783321 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-kubelet\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783358 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-run-netns\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-ovn\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783419 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-kubelet\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjs74\" (UniqueName: \"kubernetes.io/projected/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-kube-api-access-vjs74\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783474 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783496 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-run-netns\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783502 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1f54a916-b3e7-4361-b0c6-0ec7db5c31e6-agent-certs\") pod \"konnectivity-agent-rjk22\" (UID: \"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6\") " pod="kube-system/konnectivity-agent-rjk22"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-node-log\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.783570 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-cni-bin\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovn-node-metrics-cert\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783631 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-ovn\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-slash\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-run-ovn-kubernetes\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-env-overrides\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovnkube-script-lib\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-cni-bin\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-systemd\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-node-log\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783804 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") "
pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783829 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-log-socket\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1f54a916-b3e7-4361-b0c6-0ec7db5c31e6-konnectivity-ca\") pod \"konnectivity-agent-rjk22\" (UID: \"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6\") " pod="kube-system/konnectivity-agent-rjk22" Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-var-lib-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783973 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-slash\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.783989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-var-lib-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784025 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-openvswitch\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784035 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-log-socket\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784062 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-run-systemd\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784121 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovnkube-script-lib\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovnkube-config\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.784801 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.784614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1f54a916-b3e7-4361-b0c6-0ec7db5c31e6-konnectivity-ca\") pod \"konnectivity-agent-rjk22\" (UID: \"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6\") " pod="kube-system/konnectivity-agent-rjk22" Apr 17 20:43:55.786595 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.786538 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-ovn-node-metrics-cert\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.787039 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.787018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1f54a916-b3e7-4361-b0c6-0ec7db5c31e6-agent-certs\") pod \"konnectivity-agent-rjk22\" (UID: \"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6\") " pod="kube-system/konnectivity-agent-rjk22" Apr 17 20:43:55.787555 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.787540 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:43:55.789822 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.789807 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:43:55.789878 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.789825 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:43:55.789878 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.789835 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:55.789949 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:55.789879 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:43:56.28986575 +0000 UTC m=+2.088743897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:55.792508 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.792492 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjs74\" (UniqueName: \"kubernetes.io/projected/a193cfd6-995e-4072-a6e1-26f3f8ca3a85-kube-api-access-vjs74\") pod \"ovnkube-node-fdrzh\" (UID: \"a193cfd6-995e-4072-a6e1-26f3f8ca3a85\") " pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:55.820373 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.820348 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:43:55.887237 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.887154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" Apr 17 20:43:55.892845 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.892823 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfeb0b83_e72b_4e99_a672_fe8226b4c276.slice/crio-8c5136cf2d99639d63c780dd98fecea21f7362fd5d2793a094cfeb751db5fe92 WatchSource:0}: Error finding container 8c5136cf2d99639d63c780dd98fecea21f7362fd5d2793a094cfeb751db5fe92: Status 404 returned error can't find the container with id 8c5136cf2d99639d63c780dd98fecea21f7362fd5d2793a094cfeb751db5fe92 Apr 17 20:43:55.920271 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.920250 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fmmlt" Apr 17 20:43:55.925440 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.925421 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ae2eb0_2562_4952_a3e2_66786045ebd7.slice/crio-efcafc34954da8587612cad144311f062978caa3ccda71632870401ffe5e6a3a WatchSource:0}: Error finding container efcafc34954da8587612cad144311f062978caa3ccda71632870401ffe5e6a3a: Status 404 returned error can't find the container with id efcafc34954da8587612cad144311f062978caa3ccda71632870401ffe5e6a3a Apr 17 20:43:55.936790 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.936770 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2bn4l" Apr 17 20:43:55.942539 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.942521 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bd04a6_fb3b_498b_bf3e_7b047bad740d.slice/crio-4f55384777c5d773072b40e8ca1e5b349a304190a71a403e63c319d7fa4d9d01 WatchSource:0}: Error finding container 4f55384777c5d773072b40e8ca1e5b349a304190a71a403e63c319d7fa4d9d01: Status 404 returned error can't find the container with id 4f55384777c5d773072b40e8ca1e5b349a304190a71a403e63c319d7fa4d9d01 Apr 17 20:43:55.948101 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.948085 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" Apr 17 20:43:55.953094 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.953076 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aad12b0_2520_4cf5_bc30_a332be05db03.slice/crio-896bc02510dfc52c9b7353d028c6f9e40e89efb9562d5c95a9266ef04f5c0b17 WatchSource:0}: Error finding container 896bc02510dfc52c9b7353d028c6f9e40e89efb9562d5c95a9266ef04f5c0b17: Status 404 returned error can't find the container with id 896bc02510dfc52c9b7353d028c6f9e40e89efb9562d5c95a9266ef04f5c0b17 Apr 17 20:43:55.965810 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:55.965794 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-knzfb" Apr 17 20:43:55.971246 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:55.971226 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47702967_5f03_40a5_b1ae_9f6930a86290.slice/crio-36653c4433ec6a215b20ce560a26eca3c6f01ed0b34940700d5fe9ba02595bdf WatchSource:0}: Error finding container 36653c4433ec6a215b20ce560a26eca3c6f01ed0b34940700d5fe9ba02595bdf: Status 404 returned error can't find the container with id 36653c4433ec6a215b20ce560a26eca3c6f01ed0b34940700d5fe9ba02595bdf Apr 17 20:43:56.006491 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:56.006473 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf62ce9_60d6_401a_a38e_898d286d58d4.slice/crio-2ccc3f7559ed3fac83ec3c14b2f600eaef8feb2d9eb1e8c9e4e5f01661780404 WatchSource:0}: Error finding container 2ccc3f7559ed3fac83ec3c14b2f600eaef8feb2d9eb1e8c9e4e5f01661780404: Status 404 returned error can't find the container with id 2ccc3f7559ed3fac83ec3c14b2f600eaef8feb2d9eb1e8c9e4e5f01661780404 Apr 17 20:43:56.023136 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.023119 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:43:56.027682 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.027667 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rjk22" Apr 17 20:43:56.032975 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:56.032949 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad25b90_ed3e_4976_b701_b30fbe6881cd.slice/crio-67975ff4cee7d7ebdc39ac716c99e41bcba58e622db8b6e491c17a823650aa89 WatchSource:0}: Error finding container 67975ff4cee7d7ebdc39ac716c99e41bcba58e622db8b6e491c17a823650aa89: Status 404 returned error can't find the container with id 67975ff4cee7d7ebdc39ac716c99e41bcba58e622db8b6e491c17a823650aa89 Apr 17 20:43:56.034987 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:56.034968 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda193cfd6_995e_4072_a6e1_26f3f8ca3a85.slice/crio-ce1309f8e520d91d90af139e1fa667184a536b65ec633473e38f393ed5f057ba WatchSource:0}: Error finding container ce1309f8e520d91d90af139e1fa667184a536b65ec633473e38f393ed5f057ba: Status 404 returned error can't find the container with id ce1309f8e520d91d90af139e1fa667184a536b65ec633473e38f393ed5f057ba Apr 17 20:43:56.035526 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:43:56.035466 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f54a916_b3e7_4361_b0c6_0ec7db5c31e6.slice/crio-36dbc0c2b5cf0d067522e10ece4984ba325771213afeb9ddf039a935d5f5e9fc WatchSource:0}: Error finding container 36dbc0c2b5cf0d067522e10ece4984ba325771213afeb9ddf039a935d5f5e9fc: Status 404 returned error can't find the container with id 36dbc0c2b5cf0d067522e10ece4984ba325771213afeb9ddf039a935d5f5e9fc Apr 17 20:43:56.165750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.165701 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:43:56.187160 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.187131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:56.187274 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.187256 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:43:56.187333 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.187324 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:43:57.18730204 +0000 UTC m=+2.986180185 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:43:56.389571 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.389538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:43:56.389720 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.389690 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:43:56.389720 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.389709 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:43:56.389720 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.389720 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:56.389893 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.389773 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. No retries permitted until 2026-04-17 20:43:57.389755173 +0000 UTC m=+3.188633339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:56.613206 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.613116 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:38:55 +0000 UTC" deadline="2027-10-01 14:55:16.12552887 +0000 UTC" Apr 17 20:43:56.613206 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.613162 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12762h11m19.512370955s" Apr 17 20:43:56.710467 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.707768 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:43:56.710467 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:56.707895 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:43:56.745395 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.745291 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-knzfb" event={"ID":"47702967-5f03-40a5-b1ae-9f6930a86290","Type":"ContainerStarted","Data":"36653c4433ec6a215b20ce560a26eca3c6f01ed0b34940700d5fe9ba02595bdf"} Apr 17 20:43:56.760604 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.760552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2bn4l" event={"ID":"95bd04a6-fb3b-498b-bf3e-7b047bad740d","Type":"ContainerStarted","Data":"4f55384777c5d773072b40e8ca1e5b349a304190a71a403e63c319d7fa4d9d01"} Apr 17 20:43:56.771304 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.771274 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fmmlt" event={"ID":"71ae2eb0-2562-4952-a3e2-66786045ebd7","Type":"ContainerStarted","Data":"efcafc34954da8587612cad144311f062978caa3ccda71632870401ffe5e6a3a"} Apr 17 20:43:56.783864 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.783838 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal" event={"ID":"4a540b23d926990aa9314e724ab32d20","Type":"ContainerStarted","Data":"65bb01071f0179c776022a4231905f701b6a523b2fffd990ad1b6c041719b67b"} Apr 17 20:43:56.793714 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.793664 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rjk22" event={"ID":"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6","Type":"ContainerStarted","Data":"36dbc0c2b5cf0d067522e10ece4984ba325771213afeb9ddf039a935d5f5e9fc"} Apr 17 20:43:56.807498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.807475 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"ce1309f8e520d91d90af139e1fa667184a536b65ec633473e38f393ed5f057ba"} Apr 17 20:43:56.811011 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.810984 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerStarted","Data":"896bc02510dfc52c9b7353d028c6f9e40e89efb9562d5c95a9266ef04f5c0b17"} Apr 17 20:43:56.819927 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.819907 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:43:56.841505 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.841473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" event={"ID":"bfeb0b83-e72b-4e99-a672-fe8226b4c276","Type":"ContainerStarted","Data":"8c5136cf2d99639d63c780dd98fecea21f7362fd5d2793a094cfeb751db5fe92"} Apr 17 20:43:56.847024 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.847001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal" event={"ID":"e250693f0814c5dff374e113e490f4a6","Type":"ContainerStarted","Data":"d7346b4ccc517bccdb8e43add2cf5f1dcd778e145126d5c52daa0779ab74a487"} Apr 17 20:43:56.872506 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.872412 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bcf4q" event={"ID":"0ad25b90-ed3e-4976-b701-b30fbe6881cd","Type":"ContainerStarted","Data":"67975ff4cee7d7ebdc39ac716c99e41bcba58e622db8b6e491c17a823650aa89"} Apr 17 20:43:56.882914 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:56.882497 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" event={"ID":"ecf62ce9-60d6-401a-a38e-898d286d58d4","Type":"ContainerStarted","Data":"2ccc3f7559ed3fac83ec3c14b2f600eaef8feb2d9eb1e8c9e4e5f01661780404"} Apr 17 20:43:57.196218 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:57.196137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:57.196384 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.196299 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:43:57.196384 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.196363 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:43:59.196345236 +0000 UTC m=+4.995223384 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:43:57.397933 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:57.397863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:43:57.398116 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.398062 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:43:57.398116 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.398087 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:43:57.398116 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.398101 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:57.398282 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.398173 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:43:59.398153044 +0000 UTC m=+5.197031203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:57.614140 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:57.614058 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:38:55 +0000 UTC" deadline="2027-10-23 05:47:04.767504403 +0000 UTC" Apr 17 20:43:57.614140 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:57.614092 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13281h3m7.153415662s" Apr 17 20:43:57.708172 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:57.708125 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:57.708321 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:57.708266 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:43:58.713773 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:58.713748 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:43:58.714222 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:58.713866 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:43:59.215975 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:59.215917 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:59.216195 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.216169 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:43:59.216288 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.216235 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:03.216216505 +0000 UTC m=+9.015094650 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:43:59.417738 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:59.417677 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:43:59.417883 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.417820 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:43:59.417883 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.417842 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:43:59.417883 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.417857 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:59.418043 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.417915 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:03.417897323 +0000 UTC m=+9.216775468 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:43:59.708331 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:43:59.708257 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:43:59.708535 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:43:59.708423 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:44:00.708378 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:00.708339 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:00.708790 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:00.708503 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:44:01.707719 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:01.707686 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:01.707877 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:01.707843 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:44:02.708256 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:02.708216 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:02.708751 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:02.708341 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:44:03.252829 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:03.252760 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:03.253019 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.252890 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:03.253019 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.252957 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:11.252937746 +0000 UTC m=+17.051815904 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:44:03.454078 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:03.454036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:03.454248 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.454196 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:44:03.454248 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.454218 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:44:03.454248 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.454230 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:03.454423 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.454292 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:11.454274563 +0000 UTC m=+17.253152709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:44:03.708598 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:03.708519 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:03.709107 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:03.708675 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:44:04.708757 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:04.708309 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:04.708757 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:04.708415 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:44:05.708282 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:05.708250 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:05.708481 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:05.708376 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:44:06.712215 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:06.712189 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:06.712670 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:06.712283 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:44:07.708392 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:07.708357 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:07.708587 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:07.708506 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:44:08.708379 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:08.708345 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:08.708871 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:08.708465 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128" Apr 17 20:44:08.952864 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:08.952833 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4qt6s"] Apr 17 20:44:08.986769 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:08.986692 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:08.986906 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:08.986772 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f" Apr 17 20:44:09.094102 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.094072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-dbus\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:09.094289 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.094110 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-kubelet-config\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:09.094289 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.094152 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:09.194865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.194833 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-dbus\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:09.194865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.194879 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-kubelet-config\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:09.195094 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.194929 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:09.195094 ip-10-0-139-255 
kubenswrapper[2567]: I0417 20:44:09.194995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-dbus\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:09.195094 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.195039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-kubelet-config\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:09.195094 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:09.195079 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:09.195268 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:09.195131 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret podName:8eeefe2c-274e-4ea9-a2c7-594d5fd9126f nodeName:}" failed. No retries permitted until 2026-04-17 20:44:09.69511463 +0000 UTC m=+15.493992791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret") pod "global-pull-secret-syncer-4qt6s" (UID: "8eeefe2c-274e-4ea9-a2c7-594d5fd9126f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:09.698694 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.698664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:09.698811 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:09.698798 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:09.698881 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:09.698858 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret podName:8eeefe2c-274e-4ea9-a2c7-594d5fd9126f nodeName:}" failed. No retries permitted until 2026-04-17 20:44:10.698839034 +0000 UTC m=+16.497717189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret") pod "global-pull-secret-syncer-4qt6s" (UID: "8eeefe2c-274e-4ea9-a2c7-594d5fd9126f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:09.708424 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:09.708405 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:09.708714 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:09.708515 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:10.705577 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:10.705546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:10.705771 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:10.705656 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:10.705771 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:10.705718 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret podName:8eeefe2c-274e-4ea9-a2c7-594d5fd9126f nodeName:}" failed. No retries permitted until 2026-04-17 20:44:12.705698785 +0000 UTC m=+18.504576966 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret") pod "global-pull-secret-syncer-4qt6s" (UID: "8eeefe2c-274e-4ea9-a2c7-594d5fd9126f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:10.707610 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:10.707579 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:10.707727 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:10.707688 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:10.707727 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:10.707723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:10.707812 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:10.707792 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:11.310718 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:11.310681 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:11.311174 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.310849 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:11.311174 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.310925 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:27.310904065 +0000 UTC m=+33.109782214 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:11.511598 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:11.511564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:11.511754 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.511737 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:44:11.511792 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.511760 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:44:11.511792 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.511773 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:11.511851 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.511831 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:27.511813389 +0000 UTC m=+33.310691545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:11.708204 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:11.708128 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:11.708365 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:11.708270 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:12.707702 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:12.707661 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:12.708141 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:12.707780 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:12.708215 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:12.708153 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:12.708293 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:12.708269 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:12.721044 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:12.721013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:12.721173 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:12.721156 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:12.721216 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:12.721210 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret podName:8eeefe2c-274e-4ea9-a2c7-594d5fd9126f nodeName:}" failed. No retries permitted until 2026-04-17 20:44:16.721195339 +0000 UTC m=+22.520073483 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret") pod "global-pull-secret-syncer-4qt6s" (UID: "8eeefe2c-274e-4ea9-a2c7-594d5fd9126f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:13.708099 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:13.708069 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:13.708555 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:13.708209 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:14.708890 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.708714 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:14.709628 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.708795 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:14.709628 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:14.708976 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:14.709628 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:14.709067 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:14.924587 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.924257 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2bn4l" event={"ID":"95bd04a6-fb3b-498b-bf3e-7b047bad740d","Type":"ContainerStarted","Data":"3dbf73dfef9e862c0f884a9dd4d3d667c7131fe0221a91a8088b4dbcbf4733b2"}
Apr 17 20:44:14.927131 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.927105 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"bc9d9309f350c04214a80f8acf691af1c8954464666447d5db632d3417b685e8"}
Apr 17 20:44:14.927244 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.927140 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"c908edf8290fa82f0ca2434d69fdcf8545bd00a5984f3d3b8d6a07b6c653352a"}
Apr 17 20:44:14.927244 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.927155 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"48b18f887e1a047f38e5e8742bab257abc94b85249a2729639559acd0ad8017e"}
Apr 17 20:44:14.929391 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.929279 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" event={"ID":"bfeb0b83-e72b-4e99-a672-fe8226b4c276","Type":"ContainerStarted","Data":"d592f4eb8e4772e4bc3d295967d8d96f9af18e169e41c727da56433501778d55"}
Apr 17 20:44:14.931367 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.931329 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal" event={"ID":"e250693f0814c5dff374e113e490f4a6","Type":"ContainerStarted","Data":"1cddeb7f955dc3fdd2dea5d30901fd2f654abc1278d95105d3ca4b8d30de4555"}
Apr 17 20:44:14.931625 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.931605 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"
Apr 17 20:44:14.936905 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.936865 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2bn4l" podStartSLOduration=2.816586599 podStartE2EDuration="20.936853544s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:55.944037543 +0000 UTC m=+1.742915691" lastFinishedPulling="2026-04-17 20:44:14.064304481 +0000 UTC m=+19.863182636" observedRunningTime="2026-04-17 20:44:14.936484185 +0000 UTC m=+20.735362352" watchObservedRunningTime="2026-04-17 20:44:14.936853544 +0000 UTC m=+20.735731709"
Apr 17 20:44:14.939627 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.939553 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:44:14.939719 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.939644 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal"]
Apr 17 20:44:14.949586 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:14.949515 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-q4xbt" podStartSLOduration=2.808387785 podStartE2EDuration="20.94950047s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:55.894254786 +0000 UTC m=+1.693132933" lastFinishedPulling="2026-04-17 20:44:14.03536746 +0000 UTC m=+19.834245618" observedRunningTime="2026-04-17 20:44:14.948822444 +0000 UTC m=+20.747700610" watchObservedRunningTime="2026-04-17 20:44:14.94950047 +0000 UTC m=+20.748378638"
Apr 17 20:44:15.708253 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.708041 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:15.708378 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:15.708287 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:15.933876 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.933843 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bcf4q" event={"ID":"0ad25b90-ed3e-4976-b701-b30fbe6881cd","Type":"ContainerStarted","Data":"d36406c29fc147c13d7d7f036dfadc7827e46c512f141fe1cfe24d7c425acba5"}
Apr 17 20:44:15.935123 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.935100 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" event={"ID":"ecf62ce9-60d6-401a-a38e-898d286d58d4","Type":"ContainerStarted","Data":"299a059a2bdf048c7442ac6a0b274d0d75bc4867d2deccd6cfff8053e067bd0f"}
Apr 17 20:44:15.936280 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.936260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-knzfb" event={"ID":"47702967-5f03-40a5-b1ae-9f6930a86290","Type":"ContainerStarted","Data":"2a261365d99912d81cf71a7dbf52b9e522d344e1d06d2cd57d99c7de26281d35"}
Apr 17 20:44:15.937499 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.937478 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fmmlt" event={"ID":"71ae2eb0-2562-4952-a3e2-66786045ebd7","Type":"ContainerStarted","Data":"824d2ba98c3319503ab2f62ef08b76722d716edae2e4ee35676d3761179efe5e"}
Apr 17 20:44:15.938746 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.938726 2567 generic.go:358] "Generic (PLEG): container finished" podID="4a540b23d926990aa9314e724ab32d20" containerID="ad3b2c8dd94e10ffd6af91e65654ea1506efc66440c295191f78b55f92408133" exitCode=0
Apr 17 20:44:15.938856 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.938795 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal" event={"ID":"4a540b23d926990aa9314e724ab32d20","Type":"ContainerDied","Data":"ad3b2c8dd94e10ffd6af91e65654ea1506efc66440c295191f78b55f92408133"}
Apr 17 20:44:15.939994 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.939966 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rjk22" event={"ID":"1f54a916-b3e7-4361-b0c6-0ec7db5c31e6","Type":"ContainerStarted","Data":"b4bc91f7b1017208447cfef3465a057d731eddd9d81b8406d255a7d605c095d5"}
Apr 17 20:44:15.942285 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.942268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"557185358db7ff99a5a6d7e9fcd9cec250da3c8a7ff31dd29537741638a14911"}
Apr 17 20:44:15.942361 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.942290 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"2fe51995de797c0cd1ed839524aac016084019c39bf23584180dd886e363171c"}
Apr 17 20:44:15.942361 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.942304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"39453efaf2979b8b81891d49c9c81481845c067e2a517e3542851dd1a9de7623"}
Apr 17 20:44:15.943437 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.943417 2567 generic.go:358] "Generic (PLEG): container finished" podID="2aad12b0-2520-4cf5-bc30-a332be05db03" containerID="c4a189cd91c997908263694502efe2104975074ef89b986f3cf16ebb0d152d1b" exitCode=0
Apr 17 20:44:15.943507 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.943491 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerDied","Data":"c4a189cd91c997908263694502efe2104975074ef89b986f3cf16ebb0d152d1b"}
Apr 17 20:44:15.944983 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.944939 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-255.ec2.internal" podStartSLOduration=1.944927575 podStartE2EDuration="1.944927575s" podCreationTimestamp="2026-04-17 20:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:44:15.944173315 +0000 UTC m=+21.743051480" watchObservedRunningTime="2026-04-17 20:44:15.944927575 +0000 UTC m=+21.743805741"
Apr 17 20:44:15.958027 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.957997 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bcf4q" podStartSLOduration=3.9590799949999997 podStartE2EDuration="21.95798709s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:56.034488745 +0000 UTC m=+1.833366892" lastFinishedPulling="2026-04-17 20:44:14.033395837 +0000 UTC m=+19.832273987" observedRunningTime="2026-04-17 20:44:15.957908094 +0000 UTC m=+21.756786259" watchObservedRunningTime="2026-04-17 20:44:15.95798709 +0000 UTC m=+21.756865256"
Apr 17 20:44:15.977326 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.977283 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-knzfb" podStartSLOduration=3.916686116 podStartE2EDuration="21.977272556s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:55.972620293 +0000 UTC m=+1.771498441" lastFinishedPulling="2026-04-17 20:44:14.03320654 +0000 UTC m=+19.832084881" observedRunningTime="2026-04-17 20:44:15.976837047 +0000 UTC m=+21.775715226" watchObservedRunningTime="2026-04-17 20:44:15.977272556 +0000 UTC m=+21.776150723"
Apr 17 20:44:15.999003 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.998927 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rjk22" podStartSLOduration=3.029836562 podStartE2EDuration="20.99891385s" podCreationTimestamp="2026-04-17 20:43:55 +0000 UTC" firstStartedPulling="2026-04-17 20:43:56.036963876 +0000 UTC m=+1.835842019" lastFinishedPulling="2026-04-17 20:44:14.00604115 +0000 UTC m=+19.804919307" observedRunningTime="2026-04-17 20:44:15.98876634 +0000 UTC m=+21.787644506" watchObservedRunningTime="2026-04-17 20:44:15.99891385 +0000 UTC m=+21.797792016"
Apr 17 20:44:15.999329 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:15.999294 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fmmlt" podStartSLOduration=3.920138985 podStartE2EDuration="21.999284523s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:55.926924617 +0000 UTC m=+1.725802764" lastFinishedPulling="2026-04-17 20:44:14.006070143 +0000 UTC m=+19.804948302" observedRunningTime="2026-04-17 20:44:15.998529419 +0000 UTC m=+21.797407588" watchObservedRunningTime="2026-04-17 20:44:15.999284523 +0000 UTC m=+21.798162690"
Apr 17 20:44:16.707922 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.707894 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:16.708022 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.707940 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:16.708063 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:16.708019 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:16.708148 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:16.708130 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:16.750428 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.750402 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:16.750585 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:16.750554 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:16.750639 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:16.750619 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret podName:8eeefe2c-274e-4ea9-a2c7-594d5fd9126f nodeName:}" failed. No retries permitted until 2026-04-17 20:44:24.750602553 +0000 UTC m=+30.549480696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret") pod "global-pull-secret-syncer-4qt6s" (UID: "8eeefe2c-274e-4ea9-a2c7-594d5fd9126f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:16.779298 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.779271 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 20:44:16.947392 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.947311 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" event={"ID":"ecf62ce9-60d6-401a-a38e-898d286d58d4","Type":"ContainerStarted","Data":"d36b0f7d02cc7b5bad1587cb147edc0b167d0e063f54128589b5c90108e7d31a"}
Apr 17 20:44:16.949134 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.949108 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal" event={"ID":"4a540b23d926990aa9314e724ab32d20","Type":"ContainerStarted","Data":"3dd18fbfc441761a8b68e3bd4161eb01f90b7dbbe838d17adc4cd02ee0a0fff9"}
Apr 17 20:44:16.972466 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:16.972409 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-255.ec2.internal" podStartSLOduration=21.972397371 podStartE2EDuration="21.972397371s" podCreationTimestamp="2026-04-17 20:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:44:16.971950987 +0000 UTC m=+22.770829153" watchObservedRunningTime="2026-04-17 20:44:16.972397371 +0000 UTC m=+22.771275537"
Apr 17 20:44:17.645265 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:17.644972 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:44:16.779291436Z","UUID":"5c8152c1-974b-4f24-b93f-5cbf6ff908bc","Handler":null,"Name":"","Endpoint":""}
Apr 17 20:44:17.646710 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:17.646681 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 20:44:17.646710 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:17.646715 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 20:44:17.708067 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:17.708041 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:17.708209 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:17.708169 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:17.952942 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:17.952870 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"94da9a4382b7010e05e4a92d8754e75d3789e396b38db95b8b42ece52b55c8de"}
Apr 17 20:44:18.708196 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:18.708164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:18.708379 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:18.708164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:18.708379 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:18.708318 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:18.708379 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:18.708337 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:18.956649 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:18.956610 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" event={"ID":"ecf62ce9-60d6-401a-a38e-898d286d58d4","Type":"ContainerStarted","Data":"9c9ba5e049eb9b46e07a458c66140df9f07eaa54a98b36e63cd447da1319f4e3"}
Apr 17 20:44:18.974486 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:18.974392 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vwgxp" podStartSLOduration=2.899056798 podStartE2EDuration="24.974379288s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:56.007887446 +0000 UTC m=+1.806765592" lastFinishedPulling="2026-04-17 20:44:18.083209931 +0000 UTC m=+23.882088082" observedRunningTime="2026-04-17 20:44:18.973882828 +0000 UTC m=+24.772760997" watchObservedRunningTime="2026-04-17 20:44:18.974379288 +0000 UTC m=+24.773257454"
Apr 17 20:44:19.413933 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:19.413724 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rjk22"
Apr 17 20:44:19.415045 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:19.414563 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rjk22"
Apr 17 20:44:19.708064 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:19.708030 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:19.708232 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:19.708150 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:19.962513 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:19.962389 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" event={"ID":"a193cfd6-995e-4072-a6e1-26f3f8ca3a85","Type":"ContainerStarted","Data":"629607e699491b0b80861fb233e62fd7006ea68274161b5b7aa4cee744762a10"}
Apr 17 20:44:19.987299 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:19.987249 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" podStartSLOduration=6.319896658 podStartE2EDuration="24.987231464s" podCreationTimestamp="2026-04-17 20:43:55 +0000 UTC" firstStartedPulling="2026-04-17 20:43:56.03652643 +0000 UTC m=+1.835404574" lastFinishedPulling="2026-04-17 20:44:14.703861219 +0000 UTC m=+20.502739380" observedRunningTime="2026-04-17 20:44:19.986777945 +0000 UTC m=+25.785656111" watchObservedRunningTime="2026-04-17 20:44:19.987231464 +0000 UTC m=+25.786109631"
Apr 17 20:44:20.707847 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.707813 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:20.708059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.707851 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:20.708059 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:20.707943 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:20.708059 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:20.708042 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:20.965425 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.965340 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:44:20.965425 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.965387 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:44:20.965425 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.965401 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:44:20.980490 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.980434 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:44:20.980619 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:20.980573 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh"
Apr 17 20:44:21.285997 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.285954 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4qt6s"]
Apr 17 20:44:21.286179 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.286128 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:21.286276 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:21.286247 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:21.288869 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.288837 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mpmw8"]
Apr 17 20:44:21.288962 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.288954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:21.289087 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:21.289049 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:21.289522 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.289497 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mxwcv"]
Apr 17 20:44:21.289623 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.289606 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:21.289746 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:21.289714 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:21.967959 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.967931 2567 generic.go:358] "Generic (PLEG): container finished" podID="2aad12b0-2520-4cf5-bc30-a332be05db03" containerID="7a78c757378be28e9efe413f966798a83dc04dff4b891c6bc46eef5927047cb4" exitCode=0
Apr 17 20:44:21.968409 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:21.968014 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerDied","Data":"7a78c757378be28e9efe413f966798a83dc04dff4b891c6bc46eef5927047cb4"}
Apr 17 20:44:22.707964 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:22.707934 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:22.708121 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:22.707936 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:22.708121 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:22.708034 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:22.708207 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:22.708133 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:22.708207 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:22.707938 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:22.708322 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:22.708244 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:22.956736 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:22.956708 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rjk22"
Apr 17 20:44:22.956900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:22.956814 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 20:44:22.957224 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:22.957210 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rjk22"
Apr 17 20:44:23.973076 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:23.973044 2567 generic.go:358] "Generic (PLEG): container finished" podID="2aad12b0-2520-4cf5-bc30-a332be05db03" containerID="c56406d687e45974605ab98b3f5ebf4786930fbe1903f18d8aaf5fe3f072d3f7" exitCode=0
Apr 17 20:44:23.973642 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:23.973086 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerDied","Data":"c56406d687e45974605ab98b3f5ebf4786930fbe1903f18d8aaf5fe3f072d3f7"}
Apr 17 20:44:24.709001 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:24.708818 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:24.709173 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:24.708885 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:24.709173 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:24.709080 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:24.709173 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:24.708903 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:24.709326 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:24.709152 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:24.709326 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:24.709258 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:24.818007 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:24.817984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:24.818138 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:24.818097 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:24.818138 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:24.818139 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret podName:8eeefe2c-274e-4ea9-a2c7-594d5fd9126f nodeName:}" failed. No retries permitted until 2026-04-17 20:44:40.818124648 +0000 UTC m=+46.617002793 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret") pod "global-pull-secret-syncer-4qt6s" (UID: "8eeefe2c-274e-4ea9-a2c7-594d5fd9126f") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:44:25.978432 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:25.978400 2567 generic.go:358] "Generic (PLEG): container finished" podID="2aad12b0-2520-4cf5-bc30-a332be05db03" containerID="9f73f340791f92e89d8542faa03ac9a8cbc6a66aac1a33bda4aeca40085fc85b" exitCode=0
Apr 17 20:44:25.978814 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:25.978471 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerDied","Data":"9f73f340791f92e89d8542faa03ac9a8cbc6a66aac1a33bda4aeca40085fc85b"}
Apr 17 20:44:26.708335 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:26.708298 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s"
Apr 17 20:44:26.708536 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:26.708298 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:26.708536 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:26.708431 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4qt6s" podUID="8eeefe2c-274e-4ea9-a2c7-594d5fd9126f"
Apr 17 20:44:26.708667 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:26.708529 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2"
Apr 17 20:44:26.708667 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:26.708298 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:26.708667 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:26.708630 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mpmw8" podUID="549959be-8acc-4beb-914c-74b089e36128"
Apr 17 20:44:27.338146 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.338113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv"
Apr 17 20:44:27.338524 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.338233 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:27.338524 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.338292 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:59.338277941 +0000 UTC m=+65.137156085 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:44:27.526373 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.526290 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-255.ec2.internal" event="NodeReady"
Apr 17 20:44:27.526540 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.526422 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 20:44:27.540376 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.540345 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8"
Apr 17 20:44:27.540520 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.540500 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:44:27.540520 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.540518 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:44:27.540614 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.540526 2567 projected.go:194] Error preparing data for projected volume kube-api-access-7679j for pod openshift-network-diagnostics/network-check-target-mpmw8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:27.540614 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.540578 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j podName:549959be-8acc-4beb-914c-74b089e36128 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:59.540561397 +0000 UTC m=+65.339439546 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7679j" (UniqueName: "kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j") pod "network-check-target-mpmw8" (UID: "549959be-8acc-4beb-914c-74b089e36128") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:44:27.557362 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.557339 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d5594c5c6-nphhz"]
Apr 17 20:44:27.572940 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.572915 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d5594c5c6-nphhz"]
Apr 17 20:44:27.572940 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.572940 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nd22j"]
Apr 17 20:44:27.573178 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.573049 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz"
Apr 17 20:44:27.575325 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.574893 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 20:44:27.575325 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.575047 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 20:44:27.575513 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.575428 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6qbmr\""
Apr 17 20:44:27.583951 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.583849 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 20:44:27.590812 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.590769 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh"]
Apr 17 20:44:27.590920 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.590866 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 20:44:27.591021 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.591003 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j"
Apr 17 20:44:27.593031 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.592923 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-76z4h\""
Apr 17 20:44:27.593031 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.592962 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 20:44:27.593304 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.593287 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 20:44:27.608814 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.608790 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2"]
Apr 17 20:44:27.608959 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.608939 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh"
Apr 17 20:44:27.611052 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.611034 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 20:44:27.611381 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.611143 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-42l8q\""
Apr 17 20:44:27.611381 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.611217 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 20:44:27.611381 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.611199 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 20:44:27.611381 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.611140 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 20:44:27.635647 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.635624 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"]
Apr 17 20:44:27.635766 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.635752 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2"
Apr 17 20:44:27.637850 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.637831 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 20:44:27.657162 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.657110 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jrhbz"]
Apr 17 20:44:27.657265 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.657238 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"
Apr 17 20:44:27.659870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.659848 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 20:44:27.659971 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.659850 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 20:44:27.659971 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.659894 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 20:44:27.660099 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.659978 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 20:44:27.678945 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.678923 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh"]
Apr 17 20:44:27.679048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.678952 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nd22j"]
Apr 17 20:44:27.679048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.678968 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2"]
Apr 17 20:44:27.679048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.678981 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"]
Apr 17 20:44:27.679048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.678994 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrhbz"]
Apr 17 20:44:27.679048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.679028 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qf8m5"]
Apr 17 20:44:27.679239 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.679059 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrhbz"
Apr 17 20:44:27.680906 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.680883 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 20:44:27.681169 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.681153 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pn5wd\""
Apr 17 20:44:27.681264 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.681184 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 20:44:27.681264 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.681258 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 20:44:27.697220 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.697200 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qf8m5"]
Apr 17 20:44:27.697348 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.697334 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qf8m5"
Apr 17 20:44:27.699273 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.699249 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 20:44:27.699369 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.699279 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 20:44:27.699581 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.699563 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8pvd4\""
Apr 17 20:44:27.741914 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.741893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f49e6d25-4883-4014-9729-d80699320182-tmp\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2"
Apr 17 20:44:27.742049 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.741930 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"
Apr 17 20:44:27.742049 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.741967 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-installation-pull-secrets\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz"
Apr 17 20:44:27.742049 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f49e6d25-4883-4014-9729-d80699320182-klusterlet-config\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2"
Apr 17 20:44:27.742229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742109 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz"
Apr 17 20:44:27.742229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742135 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-trusted-ca\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz"
Apr 17 20:44:27.742229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742164 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wkks\" (UniqueName: \"kubernetes.io/projected/da86b587-5c22-4e33-99b8-998971aa192e-kube-api-access-2wkks\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"
Apr 17 20:44:27.742229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cabe587e-db12-438a-a6c9-ccfe13aaaf19-ca-trust-extracted\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz"
Apr 17 20:44:27.742229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-certificates\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz"
Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742241 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-hub\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"
Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"
Apr 17 20:44:27.742498 ip-10-0-139-255
kubenswrapper[2567]: I0417 20:44:27.742276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lw6b\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-kube-api-access-2lw6b\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/97b4c177-e511-421a-a7da-585f78bf704b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh\" (UID: \"97b4c177-e511-421a-a7da-585f78bf704b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742333 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfw45\" (UniqueName: \"kubernetes.io/projected/f49e6d25-4883-4014-9729-d80699320182-kube-api-access-sfw45\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldf6\" (UniqueName: \"kubernetes.io/projected/97b4c177-e511-421a-a7da-585f78bf704b-kube-api-access-5ldf6\") pod \"managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh\" (UID: \"97b4c177-e511-421a-a7da-585f78bf704b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742404 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/23b78edd-8569-4781-bf46-bc649a833595-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:27.742498 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-ca\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.742948 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742514 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-image-registry-private-configuration\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.742948 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742581 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/da86b587-5c22-4e33-99b8-998971aa192e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.742948 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.742602 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-bound-sa-token\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.843736 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/23b78edd-8569-4781-bf46-bc649a833595-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:27.843736 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-ca\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.843951 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843776 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:27.843951 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843801 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.843951 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-bound-sa-token\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.843951 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/f49e6d25-4883-4014-9729-d80699320182-tmp\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.843951 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843912 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.843937 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f74vb\" (UniqueName: \"kubernetes.io/projected/e29bb1be-edc2-47b7-8269-a7ceb57323f1-kube-api-access-f74vb\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f49e6d25-4883-4014-9729-d80699320182-klusterlet-config\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-trusted-ca\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844105 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdj8j\" (UniqueName: \"kubernetes.io/projected/d7a05ede-324e-4207-a7a9-c301663390b7-kube-api-access-rdj8j\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wkks\" (UniqueName: \"kubernetes.io/projected/da86b587-5c22-4e33-99b8-998971aa192e-kube-api-access-2wkks\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.844161 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" 
(UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-hub\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lw6b\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-kube-api-access-2lw6b\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/97b4c177-e511-421a-a7da-585f78bf704b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh\" (UID: \"97b4c177-e511-421a-a7da-585f78bf704b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldf6\" (UniqueName: \"kubernetes.io/projected/97b4c177-e511-421a-a7da-585f78bf704b-kube-api-access-5ldf6\") pod \"managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh\" (UID: \"97b4c177-e511-421a-a7da-585f78bf704b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-image-registry-private-configuration\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/da86b587-5c22-4e33-99b8-998971aa192e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 
20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e29bb1be-edc2-47b7-8269-a7ceb57323f1-tmp-dir\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-installation-pull-secrets\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cabe587e-db12-438a-a6c9-ccfe13aaaf19-ca-trust-extracted\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.844497 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/23b78edd-8569-4781-bf46-bc649a833595-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:27.845017 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-certificates\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.845017 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844551 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfw45\" (UniqueName: \"kubernetes.io/projected/f49e6d25-4883-4014-9729-d80699320182-kube-api-access-sfw45\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.845017 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.844577 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e29bb1be-edc2-47b7-8269-a7ceb57323f1-config-volume\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.845613 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.845174 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f49e6d25-4883-4014-9729-d80699320182-tmp\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.845613 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.845303 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/da86b587-5c22-4e33-99b8-998971aa192e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.845785 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cabe587e-db12-438a-a6c9-ccfe13aaaf19-ca-trust-extracted\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.846350 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.846414 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:28.346395829 +0000 UTC m=+34.145273987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.847721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-trusted-ca\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.848407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-certificates\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.848508 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.848521 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:27.848573 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.848564 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:28.348548634 +0000 UTC m=+34.147426783 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:27.850197 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.849464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.850197 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.849626 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f49e6d25-4883-4014-9729-d80699320182-klusterlet-config\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.850197 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.849850 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-ca\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.850197 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.850139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.851270 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.851245 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/da86b587-5c22-4e33-99b8-998971aa192e-hub\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.851367 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.851348 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/97b4c177-e511-421a-a7da-585f78bf704b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh\" (UID: \"97b4c177-e511-421a-a7da-585f78bf704b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.851884 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.851858 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-image-registry-private-configuration\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.852891 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.852846 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-installation-pull-secrets\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.853114 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.853089 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lw6b\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-kube-api-access-2lw6b\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.854365 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.854339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldf6\" (UniqueName: \"kubernetes.io/projected/97b4c177-e511-421a-a7da-585f78bf704b-kube-api-access-5ldf6\") pod \"managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh\" (UID: \"97b4c177-e511-421a-a7da-585f78bf704b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.855059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.855029 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wkks\" (UniqueName: \"kubernetes.io/projected/da86b587-5c22-4e33-99b8-998971aa192e-kube-api-access-2wkks\") pod \"cluster-proxy-proxy-agent-7bddb9ffd8-zljvz\" (UID: \"da86b587-5c22-4e33-99b8-998971aa192e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:27.855241 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.855223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfw45\" (UniqueName: \"kubernetes.io/projected/f49e6d25-4883-4014-9729-d80699320182-kube-api-access-sfw45\") pod \"klusterlet-addon-workmgr-864fd9cd8-x9tr2\" (UID: \"f49e6d25-4883-4014-9729-d80699320182\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.866205 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.866186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-bound-sa-token\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:27.927607 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.927577 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:44:27.944402 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.944372 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:27.945120 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.944970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f74vb\" (UniqueName: \"kubernetes.io/projected/e29bb1be-edc2-47b7-8269-a7ceb57323f1-kube-api-access-f74vb\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.945120 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945041 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdj8j\" (UniqueName: \"kubernetes.io/projected/d7a05ede-324e-4207-a7a9-c301663390b7-kube-api-access-rdj8j\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:27.945243 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e29bb1be-edc2-47b7-8269-a7ceb57323f1-tmp-dir\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.945243 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e29bb1be-edc2-47b7-8269-a7ceb57323f1-config-volume\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.945243 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:27.945398 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945256 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.945398 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.945389 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:27.945519 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.945486 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:28.445433155 +0000 UTC m=+34.244311313 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:27.945519 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.945491 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:27.945633 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e29bb1be-edc2-47b7-8269-a7ceb57323f1-tmp-dir\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.945633 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:27.945577 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:28.445558742 +0000 UTC m=+34.244436898 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:27.945788 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.945768 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e29bb1be-edc2-47b7-8269-a7ceb57323f1-config-volume\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.954123 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.954064 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f74vb\" (UniqueName: \"kubernetes.io/projected/e29bb1be-edc2-47b7-8269-a7ceb57323f1-kube-api-access-f74vb\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:27.954334 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.954310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdj8j\" (UniqueName: \"kubernetes.io/projected/d7a05ede-324e-4207-a7a9-c301663390b7-kube-api-access-rdj8j\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:27.967098 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:27.967072 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:44:28.105324 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.105237 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh"] Apr 17 20:44:28.108297 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.108262 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2"] Apr 17 20:44:28.110478 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:44:28.110431 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97b4c177_e511_421a_a7da_585f78bf704b.slice/crio-ff962533a25d77f60b2a6b04bde87a4831c267db5f53f87d1a0e0b61589bf142 WatchSource:0}: Error finding container ff962533a25d77f60b2a6b04bde87a4831c267db5f53f87d1a0e0b61589bf142: Status 404 returned error can't find the container with id ff962533a25d77f60b2a6b04bde87a4831c267db5f53f87d1a0e0b61589bf142 Apr 17 20:44:28.111995 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:44:28.111932 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49e6d25_4883_4014_9729_d80699320182.slice/crio-b7759b455a92534908ded208cadb69c7c5afc0d8bb084437f12f7fed6e2bcd15 WatchSource:0}: Error finding container b7759b455a92534908ded208cadb69c7c5afc0d8bb084437f12f7fed6e2bcd15: Status 404 returned error can't find the container with id b7759b455a92534908ded208cadb69c7c5afc0d8bb084437f12f7fed6e2bcd15 Apr 17 20:44:28.118388 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.118348 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz"] Apr 17 20:44:28.128665 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:44:28.128642 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda86b587_5c22_4e33_99b8_998971aa192e.slice/crio-6a219fc96eb6c45280194a6dce66e63a12484b3283f21f6a9b6446c2b35a6a62 WatchSource:0}: Error finding container 6a219fc96eb6c45280194a6dce66e63a12484b3283f21f6a9b6446c2b35a6a62: Status 404 returned error can't find the container with id 6a219fc96eb6c45280194a6dce66e63a12484b3283f21f6a9b6446c2b35a6a62 Apr 17 20:44:28.348752 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.348713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:28.349212 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.348776 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:28.349212 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.348893 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 17 20:44:28.349212 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.348960 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:29.348947357 +0000 UTC m=+35.147825501 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:28.349376 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.349320 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:28.349376 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.349337 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:28.349440 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.349386 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:29.349371184 +0000 UTC m=+35.148249331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:28.449275 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.449200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:28.449275 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.449250 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:28.449540 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.449392 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:28.449540 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.449471 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:29.449438632 +0000 UTC m=+35.248316800 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:28.449540 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.449394 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:28.449540 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:28.449519 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:29.449509349 +0000 UTC m=+35.248387499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:28.708259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.708176 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:28.708407 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.708176 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:28.708643 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.708188 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:28.710365 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.710341 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:44:28.710782 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.710761 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:44:28.710782 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.710770 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb4cm\"" Apr 17 20:44:28.710936 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.710801 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jzn7k\"" Apr 17 20:44:28.710936 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.710816 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:44:28.711142 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.711091 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:44:28.987122 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.987024 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" event={"ID":"da86b587-5c22-4e33-99b8-998971aa192e","Type":"ContainerStarted","Data":"6a219fc96eb6c45280194a6dce66e63a12484b3283f21f6a9b6446c2b35a6a62"} Apr 17 20:44:28.989646 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.989584 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" event={"ID":"f49e6d25-4883-4014-9729-d80699320182","Type":"ContainerStarted","Data":"b7759b455a92534908ded208cadb69c7c5afc0d8bb084437f12f7fed6e2bcd15"} Apr 17 20:44:28.991108 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:28.991062 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" event={"ID":"97b4c177-e511-421a-a7da-585f78bf704b","Type":"ContainerStarted","Data":"ff962533a25d77f60b2a6b04bde87a4831c267db5f53f87d1a0e0b61589bf142"} Apr 17 20:44:29.356933 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:29.356898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:29.357541 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:29.356960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:29.357541 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.357104 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:44:29.357541 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.357164 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:31.357144069 +0000 UTC m=+37.156022219 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:29.357734 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.357578 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:29.357734 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.357595 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:29.357734 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.357646 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:31.357628154 +0000 UTC m=+37.156506315 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:29.457784 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:29.457750 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:29.457964 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:29.457798 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:29.458025 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.458010 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:29.458097 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.458086 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:31.458068207 +0000 UTC m=+37.256946355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:29.458168 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.458156 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:29.458221 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:29.458191 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:31.458180304 +0000 UTC m=+37.257058455 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:31.376169 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:31.375925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:31.376608 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:31.376218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:31.376608 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.376223 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:31.376608 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.376302 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:31.376608 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.376316 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:44:31.376608 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.376367 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:35.376348164 +0000 UTC m=+41.175226313 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:31.376608 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.376425 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:35.376391731 +0000 UTC m=+41.175269887 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:31.476857 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:31.476816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:31.476857 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:31.476858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:31.477086 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.476986 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:31.477086 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.477058 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:31.477086 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.477077 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:35.477055283 +0000 UTC m=+41.275933449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:31.477249 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:31.477111 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:35.477094147 +0000 UTC m=+41.275972298 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:35.411429 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:35.411392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:35.411984 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:35.411473 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:35.411984 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.411564 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:35.411984 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.411588 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:35.411984 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.411649 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:44:35.411984 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.411661 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:43.411638458 +0000 UTC m=+49.210516606 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:35.411984 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.411724 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:43.411706233 +0000 UTC m=+49.210584378 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:35.512782 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:35.512755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:35.512782 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:35.512787 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:35.513020 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.512891 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:35.513020 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.512896 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:35.513020 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.512944 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:43.512927459 +0000 UTC m=+49.311805603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:35.513020 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:35.512961 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:43.51295366 +0000 UTC m=+49.311831803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:37.009680 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.009652 2567 generic.go:358] "Generic (PLEG): container finished" podID="2aad12b0-2520-4cf5-bc30-a332be05db03" containerID="d9b9c16bcebdcbba77422a5c97f160cd1ab2573ba5422bd1708e829670e9fdb1" exitCode=0 Apr 17 20:44:37.010050 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.009731 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerDied","Data":"d9b9c16bcebdcbba77422a5c97f160cd1ab2573ba5422bd1708e829670e9fdb1"} Apr 17 20:44:37.011069 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.011045 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" event={"ID":"da86b587-5c22-4e33-99b8-998971aa192e","Type":"ContainerStarted","Data":"fe5092cb4f49949a863e0a561c515f0715441915ee9865318e525c5a2523b196"} Apr 17 20:44:37.012299 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.012234 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" event={"ID":"f49e6d25-4883-4014-9729-d80699320182","Type":"ContainerStarted","Data":"472abad9b0f528c329a50edb462f916b882d14695c1e3ced88b7d1df011b36ce"} Apr 17 20:44:37.012499 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.012480 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:37.013513 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.013487 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" event={"ID":"97b4c177-e511-421a-a7da-585f78bf704b","Type":"ContainerStarted","Data":"105623c7f6661a2974a3a99e9cb6608ff70b02796f882791299bb13e49d1574e"} Apr 17 20:44:37.013995 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.013970 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:44:37.051772 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.051727 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" podStartSLOduration=31.470016143 podStartE2EDuration="40.051716958s" podCreationTimestamp="2026-04-17 20:43:57 +0000 UTC" firstStartedPulling="2026-04-17 20:44:28.114139111 +0000 UTC m=+33.913017255" lastFinishedPulling="2026-04-17 20:44:36.695839908 +0000 UTC m=+42.494718070" observedRunningTime="2026-04-17 20:44:37.051344704 +0000 UTC m=+42.850222870" watchObservedRunningTime="2026-04-17 20:44:37.051716958 +0000 UTC m=+42.850595102" Apr 17 20:44:37.067295 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:37.067216 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" podStartSLOduration=31.500316266 podStartE2EDuration="40.067201723s" podCreationTimestamp="2026-04-17 20:43:57 +0000 UTC" firstStartedPulling="2026-04-17 
20:44:28.112556693 +0000 UTC m=+33.911434855" lastFinishedPulling="2026-04-17 20:44:36.679442153 +0000 UTC m=+42.478320312" observedRunningTime="2026-04-17 20:44:37.065593168 +0000 UTC m=+42.864471332" watchObservedRunningTime="2026-04-17 20:44:37.067201723 +0000 UTC m=+42.866079890" Apr 17 20:44:38.018964 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:38.018927 2567 generic.go:358] "Generic (PLEG): container finished" podID="2aad12b0-2520-4cf5-bc30-a332be05db03" containerID="b4b4392eaaeb95351b554b07e3a6e986a20c9ebe4ea1f3e4453aac383cacaca3" exitCode=0 Apr 17 20:44:38.019414 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:38.019012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerDied","Data":"b4b4392eaaeb95351b554b07e3a6e986a20c9ebe4ea1f3e4453aac383cacaca3"} Apr 17 20:44:39.023325 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:39.023293 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" event={"ID":"2aad12b0-2520-4cf5-bc30-a332be05db03","Type":"ContainerStarted","Data":"69ef799898ac8e8b7b57fd9507decd9d08ccc1bd936158d81dcf3298f6002d81"} Apr 17 20:44:39.044063 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:39.044010 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tz6kr" podStartSLOduration=4.318634896 podStartE2EDuration="45.0439937s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:43:55.954639903 +0000 UTC m=+1.753518048" lastFinishedPulling="2026-04-17 20:44:36.679998695 +0000 UTC m=+42.478876852" observedRunningTime="2026-04-17 20:44:39.041592862 +0000 UTC m=+44.840471029" watchObservedRunningTime="2026-04-17 20:44:39.0439937 +0000 UTC m=+44.842871870" Apr 17 20:44:40.027491 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:40.027434 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" event={"ID":"da86b587-5c22-4e33-99b8-998971aa192e","Type":"ContainerStarted","Data":"74a17712172cc1bffcc2780f0d11b64eb2162a4227f830c1b730e00ab4f7bd4f"} Apr 17 20:44:40.027491 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:40.027479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" event={"ID":"da86b587-5c22-4e33-99b8-998971aa192e","Type":"ContainerStarted","Data":"5a66c7c2bee52c1441089b087b7cab76d03a213e8028834193c3de4681332cae"} Apr 17 20:44:40.044788 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:40.044735 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" podStartSLOduration=31.689398365 podStartE2EDuration="43.044722535s" podCreationTimestamp="2026-04-17 20:43:57 +0000 UTC" firstStartedPulling="2026-04-17 20:44:28.130622545 +0000 UTC m=+33.929500688" lastFinishedPulling="2026-04-17 20:44:39.485946701 +0000 UTC m=+45.284824858" observedRunningTime="2026-04-17 20:44:40.043907304 +0000 UTC m=+45.842785472" watchObservedRunningTime="2026-04-17 20:44:40.044722535 +0000 UTC m=+45.843600700" Apr 17 20:44:40.855100 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:40.855067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:40.872062 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:40.872030 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8eeefe2c-274e-4ea9-a2c7-594d5fd9126f-original-pull-secret\") pod \"global-pull-secret-syncer-4qt6s\" (UID: \"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f\") " pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:41.020818 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:41.020780 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qt6s" Apr 17 20:44:41.130713 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:41.130615 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4qt6s"] Apr 17 20:44:41.134367 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:44:41.134339 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eeefe2c_274e_4ea9_a2c7_594d5fd9126f.slice/crio-d822bfb8a818828f598bf56b5b1518de1e449d79b7e2a6fbea073ae267afc5af WatchSource:0}: Error finding container d822bfb8a818828f598bf56b5b1518de1e449d79b7e2a6fbea073ae267afc5af: Status 404 returned error can't find the container with id d822bfb8a818828f598bf56b5b1518de1e449d79b7e2a6fbea073ae267afc5af Apr 17 20:44:42.032401 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:42.032368 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4qt6s" event={"ID":"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f","Type":"ContainerStarted","Data":"d822bfb8a818828f598bf56b5b1518de1e449d79b7e2a6fbea073ae267afc5af"} Apr 17 20:44:43.471998 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:43.471956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:43.472443 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:43.472020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:43.472443 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.472118 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:43.472443 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.472139 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:43.472443 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.472164 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 
20:44:43.472443 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.472214 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:59.47219568 +0000 UTC m=+65.271073826 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:43.472443 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.472235 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:59.472225698 +0000 UTC m=+65.271103849 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:43.573083 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:43.573057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:43.573225 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:43.573088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:43.573225 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.573210 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:43.573426 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.573268 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:44:59.573249987 +0000 UTC m=+65.372128132 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:43.573426 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.573215 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:43.573426 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:43.573354 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. 
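[Annotation] A reading aid for the timestamps: the m=+283.372685167-style suffixes are Go's monotonic clock reading, which time.Time.String() appends whenever a Time carries one; it counts seconds from (roughly) process start, so wall time minus the m value lands on the same instant, about 20:43:54, in every entry. A two-line demonstration:

```go
package main

import (
	"fmt"
	"time"
)

// time.Now() carries a monotonic reading; String() prints it as "m=+...".
func main() {
	t := time.Now()
	fmt.Println(t)          // e.g. "... +0000 UTC m=+0.000012345"
	fmt.Println(t.Round(0)) // Round(0) strips the monotonic reading
}
```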
No retries permitted until 2026-04-17 20:44:59.573339906 +0000 UTC m=+65.372218055 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:46.041138 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:46.041054 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4qt6s" event={"ID":"8eeefe2c-274e-4ea9-a2c7-594d5fd9126f","Type":"ContainerStarted","Data":"7a041c34b0ab3813a5b4f87cbaf794366794e4651d1962c7445e9caa577882dd"} Apr 17 20:44:46.053302 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:46.053128 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4qt6s" podStartSLOduration=33.414062225 podStartE2EDuration="38.053111427s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:44:41.136574001 +0000 UTC m=+46.935452146" lastFinishedPulling="2026-04-17 20:44:45.775623191 +0000 UTC m=+51.574501348" observedRunningTime="2026-04-17 20:44:46.052873618 +0000 UTC m=+51.851751786" watchObservedRunningTime="2026-04-17 20:44:46.053111427 +0000 UTC m=+51.851989594" Apr 17 20:44:52.983305 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:52.983277 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fdrzh" Apr 17 20:44:59.382872 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.382839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:44:59.385281 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.385260 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:44:59.393022 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.392994 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:44:59.393136 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.393049 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:03.393035145 +0000 UTC m=+129.191913289 (durationBeforeRetry 1m4s). 
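[Annotation] The pod_startup_latency_tracker entries relate their fields arithmetically: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is the E2E figure minus the image-pull window. Checking that against the global-pull-secret-syncer-4qt6s entry above, with values copied verbatim; the result agrees with the logged 38.053s / 33.414s to within a millisecond (the tracker presumably samples its clocks at slightly different points):

```go
package main

import (
	"fmt"
	"time"
)

// Verify: E2E = observedRunningTime - podCreationTimestamp,
// SLO = E2E - (lastFinishedPulling - firstStartedPulling).
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-04-17 20:44:08 +0000 UTC")
	firstPull := parse("2026-04-17 20:44:41.136574001 +0000 UTC")
	lastPull := parse("2026-04-17 20:44:45.775623191 +0000 UTC")
	running := parse("2026-04-17 20:44:46.052873618 +0000 UTC")

	e2e := running.Sub(created)          // ≈ podStartE2EDuration (38.053s)
	slo := e2e - lastPull.Sub(firstPull) // ≈ podStartSLOduration (33.414s)
	fmt.Println(e2e, slo)
}
```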
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : secret "metrics-daemon-secret" not found Apr 17 20:44:59.483510 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.483489 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:44:59.483624 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.483538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:44:59.483691 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.483626 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:44:59.483691 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.483644 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:44:59.483691 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.483689 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:31.483677282 +0000 UTC m=+97.282555426 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:44:59.483830 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.483705 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:44:59.483830 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.483765 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:31.483748066 +0000 UTC m=+97.282626216 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:44:59.584069 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.584047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:44:59.584173 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.584074 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:44:59.584173 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.584100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:59.584278 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.584179 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:44:59.584278 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.584228 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:31.584214902 +0000 UTC m=+97.383093045 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:44:59.584374 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.584315 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:44:59.584374 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:44:59.584358 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:45:31.58434771 +0000 UTC m=+97.383225854 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:44:59.585718 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.585704 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:44:59.596059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.596041 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:44:59.607717 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.607698 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7679j\" (UniqueName: \"kubernetes.io/projected/549959be-8acc-4beb-914c-74b089e36128-kube-api-access-7679j\") pod \"network-check-target-mpmw8\" (UID: \"549959be-8acc-4beb-914c-74b089e36128\") " pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:59.631321 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.631301 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb4cm\"" Apr 17 20:44:59.640601 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.640560 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:44:59.749516 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:44:59.749487 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mpmw8"] Apr 17 20:44:59.754508 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:44:59.754478 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549959be_8acc_4beb_914c_74b089e36128.slice/crio-db308af12c8c27da754e0509657484550a6bbe5e3a3c812d25d46d1b11ce3d4a WatchSource:0}: Error finding container db308af12c8c27da754e0509657484550a6bbe5e3a3c812d25d46d1b11ce3d4a: Status 404 returned error can't find the container with id db308af12c8c27da754e0509657484550a6bbe5e3a3c812d25d46d1b11ce3d4a Apr 17 20:45:00.075061 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:00.074984 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mpmw8" event={"ID":"549959be-8acc-4beb-914c-74b089e36128","Type":"ContainerStarted","Data":"db308af12c8c27da754e0509657484550a6bbe5e3a3c812d25d46d1b11ce3d4a"} Apr 17 20:45:03.083934 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:03.083898 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mpmw8" event={"ID":"549959be-8acc-4beb-914c-74b089e36128","Type":"ContainerStarted","Data":"7056192cc78c261082a56a0655c540dbb857b320b079bbc518c89095702a1ce9"} Apr 17 20:45:03.084284 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:03.084018 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:45:03.100902 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:03.100859 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mpmw8" podStartSLOduration=65.482035639 podStartE2EDuration="1m8.100847379s" 
podCreationTimestamp="2026-04-17 20:43:55 +0000 UTC" firstStartedPulling="2026-04-17 20:44:59.756100154 +0000 UTC m=+65.554978312" lastFinishedPulling="2026-04-17 20:45:02.374911905 +0000 UTC m=+68.173790052" observedRunningTime="2026-04-17 20:45:03.100118395 +0000 UTC m=+68.898996561" watchObservedRunningTime="2026-04-17 20:45:03.100847379 +0000 UTC m=+68.899725536" Apr 17 20:45:31.523496 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:31.523434 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:45:31.523922 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:31.523516 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:45:31.523922 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.523587 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:45:31.523922 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.523608 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:45:31.523922 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.523638 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:45:31.523922 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.523666 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:35.52364883 +0000 UTC m=+161.322526977 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:45:31.523922 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.523690 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:35.523678777 +0000 UTC m=+161.322556922 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:45:31.624050 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:31.624026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:45:31.624130 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:31.624056 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:45:31.624169 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.624136 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:45:31.624169 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.624154 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:45:31.624229 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.624175 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:35.624164778 +0000 UTC m=+161.423042922 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:45:31.624229 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:45:31.624213 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:46:35.624200318 +0000 UTC m=+161.423078462 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:45:34.088806 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:45:34.088774 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mpmw8" Apr 17 20:46:03.450323 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:03.450274 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:46:03.450891 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:03.450419 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:46:03.450891 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:03.450502 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs podName:b84b134c-9465-48d2-b811-36203ae88de2 nodeName:}" failed. No retries permitted until 2026-04-17 20:48:05.450485262 +0000 UTC m=+251.249363407 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs") pod "network-metrics-daemon-mxwcv" (UID: "b84b134c-9465-48d2-b811-36203ae88de2") : secret "metrics-daemon-secret" not found Apr 17 20:46:27.461158 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:27.461128 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bcf4q_0ad25b90-ed3e-4976-b701-b30fbe6881cd/dns-node-resolver/0.log" Apr 17 20:46:28.457592 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:28.457558 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fmmlt_71ae2eb0-2562-4952-a3e2-66786045ebd7/node-ca/0.log" Apr 17 20:46:30.585708 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:30.585666 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" podUID="cabe587e-db12-438a-a6c9-ccfe13aaaf19" Apr 17 20:46:30.602854 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:30.602820 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" podUID="23b78edd-8569-4781-bf46-bc649a833595" Apr 17 20:46:30.699367 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:30.699324 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jrhbz" podUID="d7a05ede-324e-4207-a7a9-c301663390b7" Apr 17 20:46:30.708645 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:30.708610 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process 
volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qf8m5" podUID="e29bb1be-edc2-47b7-8269-a7ceb57323f1" Apr 17 20:46:31.284140 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:31.284064 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:46:31.284289 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:31.284064 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:46:31.735113 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:31.735073 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mxwcv" podUID="b84b134c-9465-48d2-b811-36203ae88de2" Apr 17 20:46:35.573552 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:35.573512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:46:35.574088 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:35.573633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") pod \"image-registry-6d5594c5c6-nphhz\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:46:35.574088 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.573655 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 20:46:35.574088 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.573749 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:46:35.574088 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.573760 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert podName:23b78edd-8569-4781-bf46-bc649a833595 nodeName:}" failed. No retries permitted until 2026-04-17 20:48:37.573743626 +0000 UTC m=+283.372621773 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nd22j" (UID: "23b78edd-8569-4781-bf46-bc649a833595") : secret "networking-console-plugin-cert" not found Apr 17 20:46:35.574088 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.573764 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d5594c5c6-nphhz: secret "image-registry-tls" not found Apr 17 20:46:35.574088 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.573825 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls podName:cabe587e-db12-438a-a6c9-ccfe13aaaf19 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:48:37.573807 +0000 UTC m=+283.372685167 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls") pod "image-registry-6d5594c5c6-nphhz" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19") : secret "image-registry-tls" not found Apr 17 20:46:35.674247 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:35.674219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:46:35.674247 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:35.674253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:46:35.674503 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.674367 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:46:35.674503 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.674426 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert podName:d7a05ede-324e-4207-a7a9-c301663390b7 nodeName:}" failed. No retries permitted until 2026-04-17 20:48:37.674412811 +0000 UTC m=+283.473290959 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert") pod "ingress-canary-jrhbz" (UID: "d7a05ede-324e-4207-a7a9-c301663390b7") : secret "canary-serving-cert" not found Apr 17 20:46:35.674503 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.674464 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:46:35.674643 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:46:35.674535 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls podName:e29bb1be-edc2-47b7-8269-a7ceb57323f1 nodeName:}" failed. No retries permitted until 2026-04-17 20:48:37.674516944 +0000 UTC m=+283.473395088 (durationBeforeRetry 2m2s). 
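[Annotation] The "Error syncing pod, skipping ... context deadline exceeded" entries just above (20:46:30) are the other half of the story: pod sync waits for all volumes to mount under a deadline on the order of two minutes, gives up, and the whole sync is retried, which is what produces the subsequent "No sandbox for pod can be found. Need to start a new one" entries. A sketch of that wait shape; the 2m3s figure and the poll interval are illustrative assumptions, not values quoted from kubelet:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// Poll until all volumes report mounted or the deadline expires; on
// expiry the caller records the "unmounted volumes=[...]" error and
// the pod worker retries the sync later.
func waitForMounts(ctx context.Context, mounted func() bool) error {
	return wait.PollUntilContextTimeout(ctx, 300*time.Millisecond, 2*time.Minute+3*time.Second, true,
		func(context.Context) (bool, error) { return mounted(), nil })
}

func main() {
	err := waitForMounts(context.Background(), func() bool { return false })
	fmt.Println(err) // context deadline exceeded
}
```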
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls") pod "dns-default-qf8m5" (UID: "e29bb1be-edc2-47b7-8269-a7ceb57323f1") : secret "dns-default-metrics-tls" not found Apr 17 20:46:37.013312 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.013259 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" podUID="f49e6d25-4883-4014-9729-d80699320182" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 17 20:46:37.297654 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.297571 2567 generic.go:358] "Generic (PLEG): container finished" podID="f49e6d25-4883-4014-9729-d80699320182" containerID="472abad9b0f528c329a50edb462f916b882d14695c1e3ced88b7d1df011b36ce" exitCode=1 Apr 17 20:46:37.297784 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.297645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" event={"ID":"f49e6d25-4883-4014-9729-d80699320182","Type":"ContainerDied","Data":"472abad9b0f528c329a50edb462f916b882d14695c1e3ced88b7d1df011b36ce"} Apr 17 20:46:37.298001 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.297975 2567 scope.go:117] "RemoveContainer" containerID="472abad9b0f528c329a50edb462f916b882d14695c1e3ced88b7d1df011b36ce" Apr 17 20:46:37.298854 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.298832 2567 generic.go:358] "Generic (PLEG): container finished" podID="97b4c177-e511-421a-a7da-585f78bf704b" containerID="105623c7f6661a2974a3a99e9cb6608ff70b02796f882791299bb13e49d1574e" exitCode=255 Apr 17 20:46:37.298910 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.298870 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" event={"ID":"97b4c177-e511-421a-a7da-585f78bf704b","Type":"ContainerDied","Data":"105623c7f6661a2974a3a99e9cb6608ff70b02796f882791299bb13e49d1574e"} Apr 17 20:46:37.299124 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.299108 2567 scope.go:117] "RemoveContainer" containerID="105623c7f6661a2974a3a99e9cb6608ff70b02796f882791299bb13e49d1574e" Apr 17 20:46:37.928310 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.928264 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" Apr 17 20:46:37.945413 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:37.945388 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:46:38.303796 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:38.303681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" event={"ID":"f49e6d25-4883-4014-9729-d80699320182","Type":"ContainerStarted","Data":"3dc99f187d9a288b2d5ffa606ef0a8436a2181a4543b6ba9c5f3a953af76372c"} Apr 17 20:46:38.304215 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:38.303903 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:46:38.304888 ip-10-0-139-255 kubenswrapper[2567]: I0417 
20:46:38.304865 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-864fd9cd8-x9tr2" Apr 17 20:46:38.305391 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:38.305372 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c848dbdfc-xhbmh" event={"ID":"97b4c177-e511-421a-a7da-585f78bf704b","Type":"ContainerStarted","Data":"f0cb7ab1d3b248523084f9e724583a5ad8c44371f56c2b962fabcb49767cf0b5"} Apr 17 20:46:41.708148 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:41.708118 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:46:42.708502 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:42.708446 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qf8m5" Apr 17 20:46:43.707804 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:43.707762 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:46:48.683052 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.683016 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nqw6n"] Apr 17 20:46:48.686553 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.686534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.688900 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.688877 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:46:48.689493 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.689465 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:46:48.689621 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.689529 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jrrs2\"" Apr 17 20:46:48.689621 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.689537 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:46:48.689621 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.689529 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:46:48.697229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.697206 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nqw6n"] Apr 17 20:46:48.770600 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.770577 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.770600 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.770607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.770745 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.770694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-crio-socket\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.770745 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.770711 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drvlc\" (UniqueName: \"kubernetes.io/projected/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-kube-api-access-drvlc\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.770834 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.770761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-data-volume\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.871940 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.871913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-crio-socket\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872057 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.871943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drvlc\" (UniqueName: \"kubernetes.io/projected/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-kube-api-access-drvlc\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872057 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.871964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-data-volume\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872057 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.872039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-crio-socket\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872170 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.872120 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872170 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.872155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872240 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.872225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-data-volume\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.872616 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.872598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.874372 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.874355 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.880170 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.880152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drvlc\" (UniqueName: \"kubernetes.io/projected/71b8cadb-5b6a-4cfd-b79f-08bef397fb44-kube-api-access-drvlc\") pod \"insights-runtime-extractor-nqw6n\" (UID: \"71b8cadb-5b6a-4cfd-b79f-08bef397fb44\") " pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:48.995777 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:48.995703 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nqw6n" Apr 17 20:46:49.110488 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:49.110409 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nqw6n"] Apr 17 20:46:49.114055 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:46:49.114029 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b8cadb_5b6a_4cfd_b79f_08bef397fb44.slice/crio-123b78e58bd8de8708f4a892bd5c992b132c21a8d488fec2009738e1747c0c2c WatchSource:0}: Error finding container 123b78e58bd8de8708f4a892bd5c992b132c21a8d488fec2009738e1747c0c2c: Status 404 returned error can't find the container with id 123b78e58bd8de8708f4a892bd5c992b132c21a8d488fec2009738e1747c0c2c Apr 17 20:46:49.331651 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:49.331599 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nqw6n" event={"ID":"71b8cadb-5b6a-4cfd-b79f-08bef397fb44","Type":"ContainerStarted","Data":"8a3fc1cae709e290c542c28a0bfc7a3fd2a5769dd096538f71650261bc43444d"} Apr 17 20:46:49.331651 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:49.331654 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nqw6n" event={"ID":"71b8cadb-5b6a-4cfd-b79f-08bef397fb44","Type":"ContainerStarted","Data":"123b78e58bd8de8708f4a892bd5c992b132c21a8d488fec2009738e1747c0c2c"} Apr 17 20:46:50.335593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:50.335554 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nqw6n" event={"ID":"71b8cadb-5b6a-4cfd-b79f-08bef397fb44","Type":"ContainerStarted","Data":"c9e33be4118388aaaa9bdcd515bb6d9f425be324c9a7c32294acc4e8c3f2adc2"} Apr 17 20:46:52.342162 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:52.342122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nqw6n" event={"ID":"71b8cadb-5b6a-4cfd-b79f-08bef397fb44","Type":"ContainerStarted","Data":"6820c677e3c405affb8c3487968d11d2197948711b8d3197b80ddbd3757b52bf"} Apr 17 20:46:52.357074 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:46:52.357029 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nqw6n" podStartSLOduration=2.128632408 podStartE2EDuration="4.357017213s" podCreationTimestamp="2026-04-17 20:46:48 +0000 UTC" firstStartedPulling="2026-04-17 20:46:49.163691564 +0000 UTC m=+174.962569709" lastFinishedPulling="2026-04-17 20:46:51.392076369 +0000 UTC m=+177.190954514" observedRunningTime="2026-04-17 20:46:52.356622873 +0000 UTC m=+178.155501038" watchObservedRunningTime="2026-04-17 20:46:52.357017213 +0000 UTC m=+178.155895379" Apr 17 20:47:02.294604 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.294573 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wdk4r"] Apr 17 20:47:02.298654 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.298636 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.301888 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.301864 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:47:02.302011 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.301873 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:47:02.302011 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.301944 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:47:02.302545 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.302527 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:47:02.302634 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.302556 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 20:47:02.302699 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.302630 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:47:02.302699 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.302668 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s8fns\"" Apr 17 20:47:02.367362 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-accelerators-collector-config\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367504 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-wtmp\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367504 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367422 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-sys\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367504 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-root\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367504 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-textfile\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367663 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367514 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fac65c87-d203-48d3-8dd0-754aef117237-metrics-client-ca\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367663 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367663 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367577 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxd9\" (UniqueName: \"kubernetes.io/projected/fac65c87-d203-48d3-8dd0-754aef117237-kube-api-access-2hxd9\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.367663 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.367603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-tls\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468171 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468128 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-root\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468353 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468184 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-textfile\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468353 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fac65c87-d203-48d3-8dd0-754aef117237-metrics-client-ca\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468353 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wdk4r\" (UID: 
\"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468353 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-root\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468353 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxd9\" (UniqueName: \"kubernetes.io/projected/fac65c87-d203-48d3-8dd0-754aef117237-kube-api-access-2hxd9\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468353 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-tls\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-accelerators-collector-config\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468499 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-wtmp\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:47:02.468509 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-sys\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-textfile\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:47:02.468579 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-tls podName:fac65c87-d203-48d3-8dd0-754aef117237 nodeName:}" failed. No retries permitted until 2026-04-17 20:47:02.968561102 +0000 UTC m=+188.767439252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-tls") pod "node-exporter-wdk4r" (UID: "fac65c87-d203-48d3-8dd0-754aef117237") : secret "node-exporter-tls" not found Apr 17 20:47:02.468691 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-sys\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468906 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-wtmp\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.468991 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.468969 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-accelerators-collector-config\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.469403 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.469387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fac65c87-d203-48d3-8dd0-754aef117237-metrics-client-ca\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.470740 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.470723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.476726 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.476700 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxd9\" (UniqueName: \"kubernetes.io/projected/fac65c87-d203-48d3-8dd0-754aef117237-kube-api-access-2hxd9\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.972243 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.972209 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-tls\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:02.974438 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:02.974415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fac65c87-d203-48d3-8dd0-754aef117237-node-exporter-tls\") pod \"node-exporter-wdk4r\" (UID: \"fac65c87-d203-48d3-8dd0-754aef117237\") " pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:03.207546 
ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:03.207518 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wdk4r" Apr 17 20:47:03.216350 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:47:03.216325 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac65c87_d203_48d3_8dd0_754aef117237.slice/crio-d9b1018bb0608c65e8e557a5a361ba71c5b540b0b8692fa5a30c3bbbf3d6d6c7 WatchSource:0}: Error finding container d9b1018bb0608c65e8e557a5a361ba71c5b540b0b8692fa5a30c3bbbf3d6d6c7: Status 404 returned error can't find the container with id d9b1018bb0608c65e8e557a5a361ba71c5b540b0b8692fa5a30c3bbbf3d6d6c7 Apr 17 20:47:03.371646 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:03.371613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdk4r" event={"ID":"fac65c87-d203-48d3-8dd0-754aef117237","Type":"ContainerStarted","Data":"d9b1018bb0608c65e8e557a5a361ba71c5b540b0b8692fa5a30c3bbbf3d6d6c7"} Apr 17 20:47:04.375529 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:04.375492 2567 generic.go:358] "Generic (PLEG): container finished" podID="fac65c87-d203-48d3-8dd0-754aef117237" containerID="fa800dffdf2aad641bdf04def12d170bad015bc8b0762ddb75fda318345c0df0" exitCode=0 Apr 17 20:47:04.375894 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:04.375552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdk4r" event={"ID":"fac65c87-d203-48d3-8dd0-754aef117237","Type":"ContainerDied","Data":"fa800dffdf2aad641bdf04def12d170bad015bc8b0762ddb75fda318345c0df0"} Apr 17 20:47:05.379380 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:05.379342 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdk4r" event={"ID":"fac65c87-d203-48d3-8dd0-754aef117237","Type":"ContainerStarted","Data":"41a22bc0e80b1d5456a61f6674d2edcd7c2f1d4588db0088ee4b6a3438cbd879"} Apr 17 20:47:05.379380 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:05.379382 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wdk4r" event={"ID":"fac65c87-d203-48d3-8dd0-754aef117237","Type":"ContainerStarted","Data":"cdaaa20ad87119875602bda1051393d7b35eb90acc795a560b0202b3c830e210"} Apr 17 20:47:05.399938 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:05.399883 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wdk4r" podStartSLOduration=2.547526092 podStartE2EDuration="3.399869707s" podCreationTimestamp="2026-04-17 20:47:02 +0000 UTC" firstStartedPulling="2026-04-17 20:47:03.218563082 +0000 UTC m=+189.017441242" lastFinishedPulling="2026-04-17 20:47:04.070906712 +0000 UTC m=+189.869784857" observedRunningTime="2026-04-17 20:47:05.398755323 +0000 UTC m=+191.197633490" watchObservedRunningTime="2026-04-17 20:47:05.399869707 +0000 UTC m=+191.198747916" Apr 17 20:47:07.969106 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:07.969068 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" podUID="da86b587-5c22-4e33-99b8-998971aa192e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:47:10.562709 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:10.562670 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-6d5594c5c6-nphhz"] Apr 17 20:47:10.563119 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:47:10.562865 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" podUID="cabe587e-db12-438a-a6c9-ccfe13aaaf19" Apr 17 20:47:11.392389 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.392357 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:47:11.396483 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.396445 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:47:11.536278 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536252 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cabe587e-db12-438a-a6c9-ccfe13aaaf19-ca-trust-extracted\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536376 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536287 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-certificates\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536376 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536310 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-trusted-ca\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536376 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536340 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-bound-sa-token\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536552 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536392 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-image-registry-private-configuration\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536552 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536422 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-installation-pull-secrets\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536552 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536486 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lw6b\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-kube-api-access-2lw6b\") pod \"cabe587e-db12-438a-a6c9-ccfe13aaaf19\" (UID: 
\"cabe587e-db12-438a-a6c9-ccfe13aaaf19\") " Apr 17 20:47:11.536700 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536611 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cabe587e-db12-438a-a6c9-ccfe13aaaf19-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:47:11.536770 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536746 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cabe587e-db12-438a-a6c9-ccfe13aaaf19-ca-trust-extracted\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:11.536824 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536758 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:47:11.536824 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.536794 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:47:11.538793 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.538765 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:47:11.538887 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.538847 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:47:11.538948 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.538885 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-kube-api-access-2lw6b" (OuterVolumeSpecName: "kube-api-access-2lw6b") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "kube-api-access-2lw6b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:47:11.538948 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.538896 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cabe587e-db12-438a-a6c9-ccfe13aaaf19" (UID: "cabe587e-db12-438a-a6c9-ccfe13aaaf19"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:47:11.637750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.637722 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lw6b\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-kube-api-access-2lw6b\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:11.637750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.637746 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-certificates\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:11.637750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.637757 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cabe587e-db12-438a-a6c9-ccfe13aaaf19-trusted-ca\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:11.638129 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.637767 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-bound-sa-token\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:11.638129 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.637777 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-image-registry-private-configuration\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:11.638129 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:11.637786 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cabe587e-db12-438a-a6c9-ccfe13aaaf19-installation-pull-secrets\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:12.395172 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:12.395140 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d5594c5c6-nphhz" Apr 17 20:47:12.421968 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:12.421939 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d5594c5c6-nphhz"] Apr 17 20:47:12.427508 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:12.427484 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d5594c5c6-nphhz"] Apr 17 20:47:12.544345 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:12.544322 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cabe587e-db12-438a-a6c9-ccfe13aaaf19-registry-tls\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:47:12.711648 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:12.711571 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabe587e-db12-438a-a6c9-ccfe13aaaf19" path="/var/lib/kubelet/pods/cabe587e-db12-438a-a6c9-ccfe13aaaf19/volumes" Apr 17 20:47:17.968438 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:17.968394 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" podUID="da86b587-5c22-4e33-99b8-998971aa192e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:47:27.968969 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:27.968925 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" podUID="da86b587-5c22-4e33-99b8-998971aa192e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:47:27.969336 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:27.968996 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" Apr 17 20:47:27.969522 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:27.969489 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"74a17712172cc1bffcc2780f0d11b64eb2162a4227f830c1b730e00ab4f7bd4f"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 20:47:27.969566 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:27.969543 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" podUID="da86b587-5c22-4e33-99b8-998971aa192e" containerName="service-proxy" containerID="cri-o://74a17712172cc1bffcc2780f0d11b64eb2162a4227f830c1b730e00ab4f7bd4f" gracePeriod=30 Apr 17 20:47:28.435195 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:28.435161 2567 generic.go:358] "Generic (PLEG): container finished" podID="da86b587-5c22-4e33-99b8-998971aa192e" containerID="74a17712172cc1bffcc2780f0d11b64eb2162a4227f830c1b730e00ab4f7bd4f" exitCode=2 Apr 17 20:47:28.435354 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:28.435236 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" 
event={"ID":"da86b587-5c22-4e33-99b8-998971aa192e","Type":"ContainerDied","Data":"74a17712172cc1bffcc2780f0d11b64eb2162a4227f830c1b730e00ab4f7bd4f"} Apr 17 20:47:28.435354 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:47:28.435273 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7bddb9ffd8-zljvz" event={"ID":"da86b587-5c22-4e33-99b8-998971aa192e","Type":"ContainerStarted","Data":"6bc76e25b2053932d575c70ff1e68c5c702cb0302dcd1535e38db0f8281181bf"} Apr 17 20:48:05.513694 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:05.513650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:48:05.515936 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:05.515911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b84b134c-9465-48d2-b811-36203ae88de2-metrics-certs\") pod \"network-metrics-daemon-mxwcv\" (UID: \"b84b134c-9465-48d2-b811-36203ae88de2\") " pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:48:05.611229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:05.611201 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jzn7k\"" Apr 17 20:48:05.619788 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:05.619765 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mxwcv" Apr 17 20:48:05.730350 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:05.730320 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mxwcv"] Apr 17 20:48:05.733386 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:48:05.733351 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84b134c_9465_48d2_b811_36203ae88de2.slice/crio-60aeb751a7f74e16ac29ca210114f9360c282b51e635fbf8becc2dbdfa8e5b18 WatchSource:0}: Error finding container 60aeb751a7f74e16ac29ca210114f9360c282b51e635fbf8becc2dbdfa8e5b18: Status 404 returned error can't find the container with id 60aeb751a7f74e16ac29ca210114f9360c282b51e635fbf8becc2dbdfa8e5b18 Apr 17 20:48:06.528710 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:06.528669 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mxwcv" event={"ID":"b84b134c-9465-48d2-b811-36203ae88de2","Type":"ContainerStarted","Data":"60aeb751a7f74e16ac29ca210114f9360c282b51e635fbf8becc2dbdfa8e5b18"} Apr 17 20:48:07.536209 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:07.536173 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mxwcv" event={"ID":"b84b134c-9465-48d2-b811-36203ae88de2","Type":"ContainerStarted","Data":"b4af694f5e9eb5341c9fbac443cd14bbd257bbd49816ddac880a63b78fb4f561"} Apr 17 20:48:07.536209 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:07.536213 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mxwcv" event={"ID":"b84b134c-9465-48d2-b811-36203ae88de2","Type":"ContainerStarted","Data":"cbb3cde797dc68ad8eaaa1853d5fd2b46c4ba32e60bf0167919a5b2a31a05e1f"} Apr 17 
20:48:07.549850 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:07.549800 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mxwcv" podStartSLOduration=252.487713048 podStartE2EDuration="4m13.549788429s" podCreationTimestamp="2026-04-17 20:43:54 +0000 UTC" firstStartedPulling="2026-04-17 20:48:05.735167717 +0000 UTC m=+251.534045876" lastFinishedPulling="2026-04-17 20:48:06.797243113 +0000 UTC m=+252.596121257" observedRunningTime="2026-04-17 20:48:07.549242586 +0000 UTC m=+253.348120753" watchObservedRunningTime="2026-04-17 20:48:07.549788429 +0000 UTC m=+253.348666595" Apr 17 20:48:34.285054 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:48:34.284996 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jrhbz" podUID="d7a05ede-324e-4207-a7a9-c301663390b7" Apr 17 20:48:34.285054 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:48:34.285032 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" podUID="23b78edd-8569-4781-bf46-bc649a833595" Apr 17 20:48:34.601366 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:34.601330 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:48:34.601542 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:34.601330 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:48:37.643304 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.643265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:48:37.645765 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.645741 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/23b78edd-8569-4781-bf46-bc649a833595-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nd22j\" (UID: \"23b78edd-8569-4781-bf46-bc649a833595\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:48:37.743862 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.743833 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:48:37.743997 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.743867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 
20:48:37.746059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.746024 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e29bb1be-edc2-47b7-8269-a7ceb57323f1-metrics-tls\") pod \"dns-default-qf8m5\" (UID: \"e29bb1be-edc2-47b7-8269-a7ceb57323f1\") " pod="openshift-dns/dns-default-qf8m5" Apr 17 20:48:37.746187 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.746171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7a05ede-324e-4207-a7a9-c301663390b7-cert\") pod \"ingress-canary-jrhbz\" (UID: \"d7a05ede-324e-4207-a7a9-c301663390b7\") " pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:48:37.904974 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.904892 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-76z4h\"" Apr 17 20:48:37.905209 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.905193 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pn5wd\"" Apr 17 20:48:37.910982 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.910959 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8pvd4\"" Apr 17 20:48:37.913182 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.913168 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" Apr 17 20:48:37.913231 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.913183 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jrhbz" Apr 17 20:48:37.919856 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:37.919836 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qf8m5" Apr 17 20:48:38.047750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:38.047703 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nd22j"] Apr 17 20:48:38.051689 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:48:38.051648 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b78edd_8569_4781_bf46_bc649a833595.slice/crio-92db0f531d502aaefa1f80acdd7677c89ddc4b6b12474057f5a06c146444bab3 WatchSource:0}: Error finding container 92db0f531d502aaefa1f80acdd7677c89ddc4b6b12474057f5a06c146444bab3: Status 404 returned error can't find the container with id 92db0f531d502aaefa1f80acdd7677c89ddc4b6b12474057f5a06c146444bab3 Apr 17 20:48:38.065066 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:38.065041 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jrhbz"] Apr 17 20:48:38.067746 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:48:38.067717 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a05ede_324e_4207_a7a9_c301663390b7.slice/crio-f70c9fea35c4e86d10ba0045c1ba8bbcec33493283427bcce6b2992be9e32840 WatchSource:0}: Error finding container f70c9fea35c4e86d10ba0045c1ba8bbcec33493283427bcce6b2992be9e32840: Status 404 returned error can't find the container with id f70c9fea35c4e86d10ba0045c1ba8bbcec33493283427bcce6b2992be9e32840 Apr 17 20:48:38.083273 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:38.083252 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qf8m5"] Apr 17 20:48:38.085081 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:48:38.085060 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode29bb1be_edc2_47b7_8269_a7ceb57323f1.slice/crio-49f204527c22bf750167ae54abbe3c8edf9397880407f0c045589da81dcd342f WatchSource:0}: Error finding container 49f204527c22bf750167ae54abbe3c8edf9397880407f0c045589da81dcd342f: Status 404 returned error can't find the container with id 49f204527c22bf750167ae54abbe3c8edf9397880407f0c045589da81dcd342f Apr 17 20:48:38.612216 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:38.612140 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" event={"ID":"23b78edd-8569-4781-bf46-bc649a833595","Type":"ContainerStarted","Data":"92db0f531d502aaefa1f80acdd7677c89ddc4b6b12474057f5a06c146444bab3"} Apr 17 20:48:38.613637 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:38.613608 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrhbz" event={"ID":"d7a05ede-324e-4207-a7a9-c301663390b7","Type":"ContainerStarted","Data":"f70c9fea35c4e86d10ba0045c1ba8bbcec33493283427bcce6b2992be9e32840"} Apr 17 20:48:38.615163 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:38.615137 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qf8m5" event={"ID":"e29bb1be-edc2-47b7-8269-a7ceb57323f1","Type":"ContainerStarted","Data":"49f204527c22bf750167ae54abbe3c8edf9397880407f0c045589da81dcd342f"} Apr 17 20:48:40.622262 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.622227 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jrhbz" 
event={"ID":"d7a05ede-324e-4207-a7a9-c301663390b7","Type":"ContainerStarted","Data":"3ecf1dc62dee998f08e483dc1e948c3566b0fa56a7431328b7a7da6bf12614ce"} Apr 17 20:48:40.623763 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.623741 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qf8m5" event={"ID":"e29bb1be-edc2-47b7-8269-a7ceb57323f1","Type":"ContainerStarted","Data":"0dfc498431a2f79198a3095156a1eab4a450f148b89946ec7c6cf37aeb877061"} Apr 17 20:48:40.623862 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.623772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qf8m5" event={"ID":"e29bb1be-edc2-47b7-8269-a7ceb57323f1","Type":"ContainerStarted","Data":"b9dc426e001e3d24f8ac833ca0e24fe3c16f12b83b060069142104003ec5ea8c"} Apr 17 20:48:40.623929 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.623871 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qf8m5" Apr 17 20:48:40.625038 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.625014 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" event={"ID":"23b78edd-8569-4781-bf46-bc649a833595","Type":"ContainerStarted","Data":"c2bb2903e7baf2b999170b3c0041b2d855cbc874b8afcf23a1528e48f08fe9f2"} Apr 17 20:48:40.635977 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.635936 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jrhbz" podStartSLOduration=251.637489765 podStartE2EDuration="4m13.635924054s" podCreationTimestamp="2026-04-17 20:44:27 +0000 UTC" firstStartedPulling="2026-04-17 20:48:38.069950513 +0000 UTC m=+283.868828657" lastFinishedPulling="2026-04-17 20:48:40.068384794 +0000 UTC m=+285.867262946" observedRunningTime="2026-04-17 20:48:40.634644614 +0000 UTC m=+286.433522783" watchObservedRunningTime="2026-04-17 20:48:40.635924054 +0000 UTC m=+286.434802250" Apr 17 20:48:40.653903 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.653825 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nd22j" podStartSLOduration=270.648857019 podStartE2EDuration="4m32.653812134s" podCreationTimestamp="2026-04-17 20:44:08 +0000 UTC" firstStartedPulling="2026-04-17 20:48:38.053795437 +0000 UTC m=+283.852673589" lastFinishedPulling="2026-04-17 20:48:40.058750556 +0000 UTC m=+285.857628704" observedRunningTime="2026-04-17 20:48:40.653001565 +0000 UTC m=+286.451879731" watchObservedRunningTime="2026-04-17 20:48:40.653812134 +0000 UTC m=+286.452690304" Apr 17 20:48:40.668906 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:40.668866 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qf8m5" podStartSLOduration=251.691625703 podStartE2EDuration="4m13.668852936s" podCreationTimestamp="2026-04-17 20:44:27 +0000 UTC" firstStartedPulling="2026-04-17 20:48:38.086753007 +0000 UTC m=+283.885631155" lastFinishedPulling="2026-04-17 20:48:40.06398024 +0000 UTC m=+285.862858388" observedRunningTime="2026-04-17 20:48:40.668117568 +0000 UTC m=+286.466995736" watchObservedRunningTime="2026-04-17 20:48:40.668852936 +0000 UTC m=+286.467731105" Apr 17 20:48:50.629903 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:50.629872 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qf8m5" Apr 17 20:48:54.589483 
ip-10-0-139-255 kubenswrapper[2567]: I0417 20:48:54.589444 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:51:12.777666 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.777583 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff"] Apr 17 20:51:12.780472 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.780439 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:12.782396 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.782371 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 20:51:12.782888 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.782869 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-xrktv\"" Apr 17 20:51:12.782888 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.782877 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:51:12.787601 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.787220 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff"] Apr 17 20:51:12.855371 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.855346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dccb5093-e348-4b95-9ba7-55b96602e17f-tmp\") pod \"openshift-lws-operator-bfc7f696d-m5xff\" (UID: \"dccb5093-e348-4b95-9ba7-55b96602e17f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:12.855480 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.855409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nv8\" (UniqueName: \"kubernetes.io/projected/dccb5093-e348-4b95-9ba7-55b96602e17f-kube-api-access-l5nv8\") pod \"openshift-lws-operator-bfc7f696d-m5xff\" (UID: \"dccb5093-e348-4b95-9ba7-55b96602e17f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:12.956420 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.956400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nv8\" (UniqueName: \"kubernetes.io/projected/dccb5093-e348-4b95-9ba7-55b96602e17f-kube-api-access-l5nv8\") pod \"openshift-lws-operator-bfc7f696d-m5xff\" (UID: \"dccb5093-e348-4b95-9ba7-55b96602e17f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:12.956557 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.956432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dccb5093-e348-4b95-9ba7-55b96602e17f-tmp\") pod \"openshift-lws-operator-bfc7f696d-m5xff\" (UID: \"dccb5093-e348-4b95-9ba7-55b96602e17f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:12.956796 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.956774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dccb5093-e348-4b95-9ba7-55b96602e17f-tmp\") pod \"openshift-lws-operator-bfc7f696d-m5xff\" (UID: \"dccb5093-e348-4b95-9ba7-55b96602e17f\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:12.963177 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:12.963157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nv8\" (UniqueName: \"kubernetes.io/projected/dccb5093-e348-4b95-9ba7-55b96602e17f-kube-api-access-l5nv8\") pod \"openshift-lws-operator-bfc7f696d-m5xff\" (UID: \"dccb5093-e348-4b95-9ba7-55b96602e17f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:13.090664 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:13.090630 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" Apr 17 20:51:13.204309 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:13.204274 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff"] Apr 17 20:51:13.207337 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:51:13.207310 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddccb5093_e348_4b95_9ba7_55b96602e17f.slice/crio-79393db11e3f3fd013f100a3e224a71b8d057482d0c463f6ee8b40fac220b51d WatchSource:0}: Error finding container 79393db11e3f3fd013f100a3e224a71b8d057482d0c463f6ee8b40fac220b51d: Status 404 returned error can't find the container with id 79393db11e3f3fd013f100a3e224a71b8d057482d0c463f6ee8b40fac220b51d Apr 17 20:51:13.208818 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:13.208802 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:51:14.016266 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:14.016225 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" event={"ID":"dccb5093-e348-4b95-9ba7-55b96602e17f","Type":"ContainerStarted","Data":"79393db11e3f3fd013f100a3e224a71b8d057482d0c463f6ee8b40fac220b51d"} Apr 17 20:51:16.023417 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:16.023380 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" event={"ID":"dccb5093-e348-4b95-9ba7-55b96602e17f","Type":"ContainerStarted","Data":"d6a7139c74a45b84bbecdcf6d349560d1c54d69b9b18d132825617eb742b090e"} Apr 17 20:51:16.036896 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:16.036835 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-m5xff" podStartSLOduration=1.894408872 podStartE2EDuration="4.036819429s" podCreationTimestamp="2026-04-17 20:51:12 +0000 UTC" firstStartedPulling="2026-04-17 20:51:13.208920488 +0000 UTC m=+439.007798632" lastFinishedPulling="2026-04-17 20:51:15.351331044 +0000 UTC m=+441.150209189" observedRunningTime="2026-04-17 20:51:16.036411123 +0000 UTC m=+441.835289291" watchObservedRunningTime="2026-04-17 20:51:16.036819429 +0000 UTC m=+441.835697595" Apr 17 20:51:35.071739 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.071708 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"] Apr 17 20:51:35.075117 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.075095 2567 util.go:30] "No sandbox for pod can be found. 
Apr 17 20:51:35.071739 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.071708 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"]
Apr 17 20:51:35.075117 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.075095 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.076992 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.076957 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 20:51:35.077108 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.077029 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 20:51:35.077166 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.077115 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-qjnvg\""
Apr 17 20:51:35.077207 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.077174 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 20:51:35.077249 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.077174 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 20:51:35.094972 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.094949 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"]
Apr 17 20:51:35.204488 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.204440 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e113327f-ec56-4733-9105-3369eb3947b4-webhook-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.204628 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.204506 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj4bv\" (UniqueName: \"kubernetes.io/projected/e113327f-ec56-4733-9105-3369eb3947b4-kube-api-access-hj4bv\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.204693 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.204630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e113327f-ec56-4733-9105-3369eb3947b4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.305688 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.305662 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e113327f-ec56-4733-9105-3369eb3947b4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.305788 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.305692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e113327f-ec56-4733-9105-3369eb3947b4-webhook-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.305788 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.305717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj4bv\" (UniqueName: \"kubernetes.io/projected/e113327f-ec56-4733-9105-3369eb3947b4-kube-api-access-hj4bv\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.308110 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.308090 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e113327f-ec56-4733-9105-3369eb3947b4-webhook-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.308189 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.308124 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e113327f-ec56-4733-9105-3369eb3947b4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.318514 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.318494 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj4bv\" (UniqueName: \"kubernetes.io/projected/e113327f-ec56-4733-9105-3369eb3947b4-kube-api-access-hj4bv\") pod \"opendatahub-operator-controller-manager-6dc4849f89-ndxfz\" (UID: \"e113327f-ec56-4733-9105-3369eb3947b4\") " pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.385199 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.385178 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:35.505855 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:35.505831 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"]
Apr 17 20:51:35.508932 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:51:35.508902 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode113327f_ec56_4733_9105_3369eb3947b4.slice/crio-32cf11a254f8370097fd7abdec79fa26588fda024d92d7cbd0e8ed27f5392532 WatchSource:0}: Error finding container 32cf11a254f8370097fd7abdec79fa26588fda024d92d7cbd0e8ed27f5392532: Status 404 returned error can't find the container with id 32cf11a254f8370097fd7abdec79fa26588fda024d92d7cbd0e8ed27f5392532
Apr 17 20:51:36.074251 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:36.074202 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz" event={"ID":"e113327f-ec56-4733-9105-3369eb3947b4","Type":"ContainerStarted","Data":"32cf11a254f8370097fd7abdec79fa26588fda024d92d7cbd0e8ed27f5392532"}
Apr 17 20:51:39.082816 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:39.082776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz" event={"ID":"e113327f-ec56-4733-9105-3369eb3947b4","Type":"ContainerStarted","Data":"a83b931c3fb1f56f24ccee2c37bd81de2978813998f5fa9b88a9c10e3579d66d"}
Apr 17 20:51:39.083213 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:39.082891 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:39.099796 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:39.099750 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz" podStartSLOduration=1.558967699 podStartE2EDuration="4.099738325s" podCreationTimestamp="2026-04-17 20:51:35 +0000 UTC" firstStartedPulling="2026-04-17 20:51:35.510539792 +0000 UTC m=+461.309417939" lastFinishedPulling="2026-04-17 20:51:38.051310419 +0000 UTC m=+463.850188565" observedRunningTime="2026-04-17 20:51:39.098183325 +0000 UTC m=+464.897061490" watchObservedRunningTime="2026-04-17 20:51:39.099738325 +0000 UTC m=+464.898616490"
Apr 17 20:51:50.087243 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:50.087213 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6dc4849f89-ndxfz"
Apr 17 20:51:52.949993 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.949958 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"]
Apr 17 20:51:52.953736 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.953715 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:52.955524 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.955502 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 20:51:52.956139 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.956120 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:51:52.956139 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.956129 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k8cwx\""
Apr 17 20:51:52.956277 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.956176 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:51:52.956277 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.956129 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 20:51:52.963388 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:52.963366 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"]
Apr 17 20:51:53.122121 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.122091 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df44c2b0-79cb-48a8-95fd-7c810d41b946-tmp\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.122272 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.122129 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5b7d\" (UniqueName: \"kubernetes.io/projected/df44c2b0-79cb-48a8-95fd-7c810d41b946-kube-api-access-c5b7d\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.122272 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.122179 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df44c2b0-79cb-48a8-95fd-7c810d41b946-tls-certs\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.223549 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.223463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5b7d\" (UniqueName: \"kubernetes.io/projected/df44c2b0-79cb-48a8-95fd-7c810d41b946-kube-api-access-c5b7d\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.223549 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.223513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df44c2b0-79cb-48a8-95fd-7c810d41b946-tls-certs\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.223549 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.223551 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df44c2b0-79cb-48a8-95fd-7c810d41b946-tmp\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.225751 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.225717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df44c2b0-79cb-48a8-95fd-7c810d41b946-tmp\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.225941 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.225922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df44c2b0-79cb-48a8-95fd-7c810d41b946-tls-certs\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.230540 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.230518 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5b7d\" (UniqueName: \"kubernetes.io/projected/df44c2b0-79cb-48a8-95fd-7c810d41b946-kube-api-access-c5b7d\") pod \"kube-auth-proxy-674746b5f4-g87sv\" (UID: \"df44c2b0-79cb-48a8-95fd-7c810d41b946\") " pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.263250 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.263218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"
Apr 17 20:51:53.374513 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:53.374489 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-674746b5f4-g87sv"]
Apr 17 20:51:53.377216 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:51:53.377184 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf44c2b0_79cb_48a8_95fd_7c810d41b946.slice/crio-1f9ce633ddd77375b77bcd16d25012c76ce502b9e80bace4105ce8a1965ed1db WatchSource:0}: Error finding container 1f9ce633ddd77375b77bcd16d25012c76ce502b9e80bace4105ce8a1965ed1db: Status 404 returned error can't find the container with id 1f9ce633ddd77375b77bcd16d25012c76ce502b9e80bace4105ce8a1965ed1db
Apr 17 20:51:54.124141 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:54.124092 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv" event={"ID":"df44c2b0-79cb-48a8-95fd-7c810d41b946","Type":"ContainerStarted","Data":"1f9ce633ddd77375b77bcd16d25012c76ce502b9e80bace4105ce8a1965ed1db"}
Apr 17 20:51:56.434999 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.434954 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jc74h"]
Apr 17 20:51:56.437927 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.437905 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:56.439821 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.439751 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 17 20:51:56.439942 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.439850 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-gnjl6\""
Apr 17 20:51:56.444050 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.444026 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jc74h"]
Apr 17 20:51:56.449013 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.448987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvpb\" (UniqueName: \"kubernetes.io/projected/37ba6f55-8094-4ab4-b24b-6330f49f9c07-kube-api-access-vlvpb\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:56.449119 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.449069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37ba6f55-8094-4ab4-b24b-6330f49f9c07-cert\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:56.550145 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.550110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37ba6f55-8094-4ab4-b24b-6330f49f9c07-cert\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:56.550338 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.550167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvpb\" (UniqueName: \"kubernetes.io/projected/37ba6f55-8094-4ab4-b24b-6330f49f9c07-kube-api-access-vlvpb\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:56.550338 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:51:56.550277 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 20:51:56.550476 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:51:56.550349 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37ba6f55-8094-4ab4-b24b-6330f49f9c07-cert podName:37ba6f55-8094-4ab4-b24b-6330f49f9c07 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:57.050328062 +0000 UTC m=+482.849206219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37ba6f55-8094-4ab4-b24b-6330f49f9c07-cert") pod "odh-model-controller-858dbf95b8-jc74h" (UID: "37ba6f55-8094-4ab4-b24b-6330f49f9c07") : secret "odh-model-controller-webhook-cert" not found
Apr 17 20:51:56.560666 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.560637 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvpb\" (UniqueName: \"kubernetes.io/projected/37ba6f55-8094-4ab4-b24b-6330f49f9c07-kube-api-access-vlvpb\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:56.798361 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:56.798338 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 20:51:57.053345 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:57.053282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37ba6f55-8094-4ab4-b24b-6330f49f9c07-cert\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:57.055611 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:57.055580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37ba6f55-8094-4ab4-b24b-6330f49f9c07-cert\") pod \"odh-model-controller-858dbf95b8-jc74h\" (UID: \"37ba6f55-8094-4ab4-b24b-6330f49f9c07\") " pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:57.133707 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:57.133670 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv" event={"ID":"df44c2b0-79cb-48a8-95fd-7c810d41b946","Type":"ContainerStarted","Data":"1a42c55bf065ede207272657e9679316c3495239442bd34fb1a16ae26d5d3e44"}
Apr 17 20:51:57.148259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:57.148215 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-674746b5f4-g87sv" podStartSLOduration=1.731087782 podStartE2EDuration="5.148201419s" podCreationTimestamp="2026-04-17 20:51:52 +0000 UTC" firstStartedPulling="2026-04-17 20:51:53.379329742 +0000 UTC m=+479.178207885" lastFinishedPulling="2026-04-17 20:51:56.796443367 +0000 UTC m=+482.595321522" observedRunningTime="2026-04-17 20:51:57.147100459 +0000 UTC m=+482.945978626" watchObservedRunningTime="2026-04-17 20:51:57.148201419 +0000 UTC m=+482.947079584"
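
The nestedpendingoperations record above shows the volume manager refusing to retry the failed "cert" mount for 500ms (durationBeforeRetry 500ms); that delay grows exponentially if the same operation keeps failing, and here the retry at 20:51:57.053 succeeds because the missing secret had been created in the meantime. A rough sketch of that backoff shape (only the 500ms initial delay is taken from this log; the doubling factor and cap below are my assumptions, not values read from the kubelet):

```python
from datetime import timedelta

def next_retry_delay(consecutive_errors: int,
                     initial=timedelta(milliseconds=500),
                     factor=2.0,
                     cap=timedelta(minutes=2)) -> timedelta:
    """Exponential backoff of the kind the volume manager applies:
    a short delay after the first failure, doubling on each repeat,
    capped at some maximum."""
    delay = initial * (factor ** (consecutive_errors - 1))
    return min(delay, cap)

# First failure at 20:51:56.550 -> no retry before 20:51:57.050 (+500ms),
# matching "durationBeforeRetry 500ms" in the record above.
print(next_retry_delay(1))  # 0:00:00.500000
```
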
Apr 17 20:51:57.351793 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:57.351758 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:51:57.468893 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:57.468857 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jc74h"]
Apr 17 20:51:57.471724 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:51:57.471695 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ba6f55_8094_4ab4_b24b_6330f49f9c07.slice/crio-62463b9acbef1c62ce92ac322efacb8b77fc3b09dedee0dec1ba0835eb04e276 WatchSource:0}: Error finding container 62463b9acbef1c62ce92ac322efacb8b77fc3b09dedee0dec1ba0835eb04e276: Status 404 returned error can't find the container with id 62463b9acbef1c62ce92ac322efacb8b77fc3b09dedee0dec1ba0835eb04e276
Apr 17 20:51:58.138051 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:51:58.138012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" event={"ID":"37ba6f55-8094-4ab4-b24b-6330f49f9c07","Type":"ContainerStarted","Data":"62463b9acbef1c62ce92ac322efacb8b77fc3b09dedee0dec1ba0835eb04e276"}
Apr 17 20:52:01.147805 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:01.147770 2567 generic.go:358] "Generic (PLEG): container finished" podID="37ba6f55-8094-4ab4-b24b-6330f49f9c07" containerID="15084bc64ea53af7d6368f1e8a6d831c35aa35e38c432993570b6865ca5a1146" exitCode=1
Apr 17 20:52:01.148172 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:01.147854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" event={"ID":"37ba6f55-8094-4ab4-b24b-6330f49f9c07","Type":"ContainerDied","Data":"15084bc64ea53af7d6368f1e8a6d831c35aa35e38c432993570b6865ca5a1146"}
Apr 17 20:52:01.148172 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:01.148037 2567 scope.go:117] "RemoveContainer" containerID="15084bc64ea53af7d6368f1e8a6d831c35aa35e38c432993570b6865ca5a1146"
Apr 17 20:52:02.153771 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.153742 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-wwhfg"]
Apr 17 20:52:02.154650 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.154625 2567 generic.go:358] "Generic (PLEG): container finished" podID="37ba6f55-8094-4ab4-b24b-6330f49f9c07" containerID="9c3cac15fb8b59d2f9d59e8787d0e675e52d9a324ecfc0e5c521a028b01e0464" exitCode=1
Apr 17 20:52:02.156983 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.156959 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" event={"ID":"37ba6f55-8094-4ab4-b24b-6330f49f9c07","Type":"ContainerDied","Data":"9c3cac15fb8b59d2f9d59e8787d0e675e52d9a324ecfc0e5c521a028b01e0464"}
Apr 17 20:52:02.157098 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.156996 2567 scope.go:117] "RemoveContainer" containerID="15084bc64ea53af7d6368f1e8a6d831c35aa35e38c432993570b6865ca5a1146"
Apr 17 20:52:02.157222 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.157198 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.157384 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.157365 2567 scope.go:117] "RemoveContainer" containerID="9c3cac15fb8b59d2f9d59e8787d0e675e52d9a324ecfc0e5c521a028b01e0464"
Apr 17 20:52:02.157718 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:52:02.157697 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jc74h_opendatahub(37ba6f55-8094-4ab4-b24b-6330f49f9c07)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" podUID="37ba6f55-8094-4ab4-b24b-6330f49f9c07"
Apr 17 20:52:02.159479 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.159394 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 17 20:52:02.159588 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.159534 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-t98ww\""
Apr 17 20:52:02.163350 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.163330 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-wwhfg"]
Apr 17 20:52:02.286238 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.286214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418163da-535c-405a-bff0-684797bf68e4-cert\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.286370 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.286246 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5rbw\" (UniqueName: \"kubernetes.io/projected/418163da-535c-405a-bff0-684797bf68e4-kube-api-access-q5rbw\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.386541 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.386512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418163da-535c-405a-bff0-684797bf68e4-cert\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.386651 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.386550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5rbw\" (UniqueName: \"kubernetes.io/projected/418163da-535c-405a-bff0-684797bf68e4-kube-api-access-q5rbw\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.386716 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:52:02.386643 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 17 20:52:02.386716 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:52:02.386709 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/418163da-535c-405a-bff0-684797bf68e4-cert podName:418163da-535c-405a-bff0-684797bf68e4 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:02.886686325 +0000 UTC m=+488.685564469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/418163da-535c-405a-bff0-684797bf68e4-cert") pod "kserve-controller-manager-856948b99f-wwhfg" (UID: "418163da-535c-405a-bff0-684797bf68e4") : secret "kserve-webhook-server-cert" not found
Apr 17 20:52:02.393937 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.393911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5rbw\" (UniqueName: \"kubernetes.io/projected/418163da-535c-405a-bff0-684797bf68e4-kube-api-access-q5rbw\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.889958 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.889923 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418163da-535c-405a-bff0-684797bf68e4-cert\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:02.892241 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:02.892214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418163da-535c-405a-bff0-684797bf68e4-cert\") pod \"kserve-controller-manager-856948b99f-wwhfg\" (UID: \"418163da-535c-405a-bff0-684797bf68e4\") " pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:03.071356 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:03.071318 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:03.160407 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:03.160382 2567 scope.go:117] "RemoveContainer" containerID="9c3cac15fb8b59d2f9d59e8787d0e675e52d9a324ecfc0e5c521a028b01e0464"
Apr 17 20:52:03.160834 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:52:03.160608 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jc74h_opendatahub(37ba6f55-8094-4ab4-b24b-6330f49f9c07)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" podUID="37ba6f55-8094-4ab4-b24b-6330f49f9c07"
Apr 17 20:52:03.192373 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:03.192347 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-wwhfg"]
Apr 17 20:52:03.195145 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:52:03.195113 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418163da_535c_405a_bff0_684797bf68e4.slice/crio-64994c4af32b88ee35f2928fb2d23aeaefc3e035846875072de3178e25fc868c WatchSource:0}: Error finding container 64994c4af32b88ee35f2928fb2d23aeaefc3e035846875072de3178e25fc868c: Status 404 returned error can't find the container with id 64994c4af32b88ee35f2928fb2d23aeaefc3e035846875072de3178e25fc868c
Apr 17 20:52:04.165086 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:04.165051 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg" event={"ID":"418163da-535c-405a-bff0-684797bf68e4","Type":"ContainerStarted","Data":"64994c4af32b88ee35f2928fb2d23aeaefc3e035846875072de3178e25fc868c"}
Apr 17 20:52:06.173064 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:06.173026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg" event={"ID":"418163da-535c-405a-bff0-684797bf68e4","Type":"ContainerStarted","Data":"a95fcb4ad041d301d5acdfdcd51f30eed5393ff1dfa64c8eafd00e457e771116"}
Apr 17 20:52:06.173419 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:06.173092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg"
Apr 17 20:52:06.186318 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:06.186257 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-wwhfg" podStartSLOduration=1.7533851 podStartE2EDuration="4.186238668s" podCreationTimestamp="2026-04-17 20:52:02 +0000 UTC" firstStartedPulling="2026-04-17 20:52:03.196438736 +0000 UTC m=+488.995316897" lastFinishedPulling="2026-04-17 20:52:05.629292318 +0000 UTC m=+491.428170465" observedRunningTime="2026-04-17 20:52:06.186034711 +0000 UTC m=+491.984912877" watchObservedRunningTime="2026-04-17 20:52:06.186238668 +0000 UTC m=+491.985116835"
Apr 17 20:52:07.352051 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:07.352020 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:52:07.352416 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:07.352357 2567 scope.go:117] "RemoveContainer" containerID="9c3cac15fb8b59d2f9d59e8787d0e675e52d9a324ecfc0e5c521a028b01e0464"
Apr 17 20:52:07.352544 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:52:07.352526 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jc74h_opendatahub(37ba6f55-8094-4ab4-b24b-6330f49f9c07)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" podUID="37ba6f55-8094-4ab4-b24b-6330f49f9c07"
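
The manager container of odh-model-controller-858dbf95b8-jc74h has now exited twice with exit code 1, so the kubelet parks it in CrashLoopBackOff rather than restarting it immediately: "back-off 10s" in the records above. The delay doubles on each subsequent crash up to a cap, and resets once the container runs cleanly for a while. A sketch of that schedule (the 10s comes from this log; the doubling and the 5m cap are the commonly documented defaults, stated here as assumptions rather than values read from this kubelet):

```python
def crashloop_delays(crashes: int, initial=10.0, cap=300.0):
    """Back-off delays the kubelet imposes between container restarts:
    10s after the first crash, doubling per crash, capped (commonly 5m)."""
    delay = initial
    for _ in range(crashes):
        yield min(delay, cap)
        delay *= 2

print(list(crashloop_delays(6)))  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0]
```
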
Apr 17 20:52:11.842501 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.842467 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"]
Apr 17 20:52:11.851939 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.851918 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:11.853972 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.853948 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 17 20:52:11.854266 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.854249 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 17 20:52:11.854583 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.854570 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-dq75v\""
Apr 17 20:52:11.859400 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.859378 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"]
Apr 17 20:52:11.947576 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.947539 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rm4f\" (UniqueName: \"kubernetes.io/projected/8281887c-6d85-45cc-a3ce-0dc0012635e8-kube-api-access-9rm4f\") pod \"servicemesh-operator3-55f49c5f94-gb7nr\" (UID: \"8281887c-6d85-45cc-a3ce-0dc0012635e8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:11.947719 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:11.947587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8281887c-6d85-45cc-a3ce-0dc0012635e8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gb7nr\" (UID: \"8281887c-6d85-45cc-a3ce-0dc0012635e8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:12.048572 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:12.048546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rm4f\" (UniqueName: \"kubernetes.io/projected/8281887c-6d85-45cc-a3ce-0dc0012635e8-kube-api-access-9rm4f\") pod \"servicemesh-operator3-55f49c5f94-gb7nr\" (UID: \"8281887c-6d85-45cc-a3ce-0dc0012635e8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:12.048687 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:12.048589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8281887c-6d85-45cc-a3ce-0dc0012635e8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gb7nr\" (UID: \"8281887c-6d85-45cc-a3ce-0dc0012635e8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:12.051093 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:12.051073 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8281887c-6d85-45cc-a3ce-0dc0012635e8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gb7nr\" (UID: \"8281887c-6d85-45cc-a3ce-0dc0012635e8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:12.056829 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:12.056807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rm4f\" (UniqueName: \"kubernetes.io/projected/8281887c-6d85-45cc-a3ce-0dc0012635e8-kube-api-access-9rm4f\") pod \"servicemesh-operator3-55f49c5f94-gb7nr\" (UID: \"8281887c-6d85-45cc-a3ce-0dc0012635e8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:12.161297 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:12.161247 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:12.276775 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:12.276750 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"]
Apr 17 20:52:12.279206 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:52:12.279173 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8281887c_6d85_45cc_a3ce_0dc0012635e8.slice/crio-99763c60c6211180bf088b581db832d5be5c31955df1b0d30eaa9868e034eb29 WatchSource:0}: Error finding container 99763c60c6211180bf088b581db832d5be5c31955df1b0d30eaa9868e034eb29: Status 404 returned error can't find the container with id 99763c60c6211180bf088b581db832d5be5c31955df1b0d30eaa9868e034eb29
Apr 17 20:52:13.196297 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:13.196260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr" event={"ID":"8281887c-6d85-45cc-a3ce-0dc0012635e8","Type":"ContainerStarted","Data":"99763c60c6211180bf088b581db832d5be5c31955df1b0d30eaa9868e034eb29"}
Apr 17 20:52:15.204257 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:15.204143 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr" event={"ID":"8281887c-6d85-45cc-a3ce-0dc0012635e8","Type":"ContainerStarted","Data":"d51cf4567f741c8d3bcc70851ab1cc497b377b5dc9bf058046f427a00a40f3ed"}
Apr 17 20:52:15.204661 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:15.204263 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr"
Apr 17 20:52:17.352070 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.352030 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:52:17.352445 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.352375 2567 scope.go:117] "RemoveContainer" containerID="9c3cac15fb8b59d2f9d59e8787d0e675e52d9a324ecfc0e5c521a028b01e0464"
Apr 17 20:52:17.453793 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.453736 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gb7nr" podStartSLOduration=3.80331788 podStartE2EDuration="6.453719078s" podCreationTimestamp="2026-04-17 20:52:11 +0000 UTC" firstStartedPulling="2026-04-17 20:52:12.281764859 +0000 UTC m=+498.080643016" lastFinishedPulling="2026-04-17 20:52:14.932166067 +0000 UTC m=+500.731044214" observedRunningTime="2026-04-17 20:52:15.22499089 +0000 UTC m=+501.023869051" watchObservedRunningTime="2026-04-17 20:52:17.453719078 +0000 UTC m=+503.252597245"
Apr 17 20:52:17.455464 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.455414 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"]
Apr 17 20:52:17.459148 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.459128 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.460913 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.460888 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 17 20:52:17.461016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.460888 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 20:52:17.461016 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.460896 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 17 20:52:17.461119 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.461091 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 17 20:52:17.461294 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.461269 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-2l4x5\""
Apr 17 20:52:17.472508 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.472487 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"]
Apr 17 20:52:17.485717 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.485802 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485733 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.485802 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.485883 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485814 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqkp\" (UniqueName: \"kubernetes.io/projected/90dfe00c-70eb-4a27-bcea-7f82655ba36c-kube-api-access-thqkp\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.485883 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485839 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/90dfe00c-70eb-4a27-bcea-7f82655ba36c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.485883 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.485969 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.485952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586765 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586729 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586765 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586772 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586985 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586985 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqkp\" (UniqueName: \"kubernetes.io/projected/90dfe00c-70eb-4a27-bcea-7f82655ba36c-kube-api-access-thqkp\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586985 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/90dfe00c-70eb-4a27-bcea-7f82655ba36c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586985 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.586985 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.586934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.587466 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.587415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.589275 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.589254 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/90dfe00c-70eb-4a27-bcea-7f82655ba36c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.589275 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.589271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.589465 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.589428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.589524 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.589492 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/90dfe00c-70eb-4a27-bcea-7f82655ba36c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.594199 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.594176 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/90dfe00c-70eb-4a27-bcea-7f82655ba36c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.594344 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.594326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqkp\" (UniqueName: \"kubernetes.io/projected/90dfe00c-70eb-4a27-bcea-7f82655ba36c-kube-api-access-thqkp\") pod \"istiod-openshift-gateway-55ff986f96-vv4x6\" (UID: \"90dfe00c-70eb-4a27-bcea-7f82655ba36c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.768969 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.768935 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:17.891147 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:17.891077 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"]
Apr 17 20:52:17.894705 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:52:17.894666 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90dfe00c_70eb_4a27_bcea_7f82655ba36c.slice/crio-bce4e963dfa2d22a4c621499506a673a71e210fd5dfef7f37c90022a88aaa71f WatchSource:0}: Error finding container bce4e963dfa2d22a4c621499506a673a71e210fd5dfef7f37c90022a88aaa71f: Status 404 returned error can't find the container with id bce4e963dfa2d22a4c621499506a673a71e210fd5dfef7f37c90022a88aaa71f
Apr 17 20:52:18.216367 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:18.216282 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6" event={"ID":"90dfe00c-70eb-4a27-bcea-7f82655ba36c","Type":"ContainerStarted","Data":"bce4e963dfa2d22a4c621499506a673a71e210fd5dfef7f37c90022a88aaa71f"}
Apr 17 20:52:18.218195 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:18.218161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" event={"ID":"37ba6f55-8094-4ab4-b24b-6330f49f9c07","Type":"ContainerStarted","Data":"ef6fafeaec0ab06ea61d2570206ba09ebdbe552bec014ca9804b0c08be1ef9a4"}
Apr 17 20:52:18.218959 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:18.218934 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h"
Apr 17 20:52:18.234408 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:18.234361 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-jc74h" podStartSLOduration=2.085162912 podStartE2EDuration="22.234348366s" podCreationTimestamp="2026-04-17 20:51:56 +0000 UTC" firstStartedPulling="2026-04-17 20:51:57.472951683 +0000 UTC m=+483.271829826" lastFinishedPulling="2026-04-17 20:52:17.622137136 +0000 UTC m=+503.421015280" observedRunningTime="2026-04-17 20:52:18.233255034 +0000 UTC m=+504.032133200" watchObservedRunningTime="2026-04-17 20:52:18.234348366 +0000 UTC m=+504.033226531"
Apr 17 20:52:20.312698 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:20.312654 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 20:52:20.312984 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:20.312764 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 20:52:21.229149 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:21.229111 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6" event={"ID":"90dfe00c-70eb-4a27-bcea-7f82655ba36c","Type":"ContainerStarted","Data":"956a6d2a249d00aa1f9e601530286415eff70acf4cc6fbef2f30cb342a8dbb39"}
Apr 17 20:52:21.229437 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:21.229393 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6"
Apr 17 20:52:21.231202 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:21.231164 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-vv4x6 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 17 20:52:21.231341 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:21.231225 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6" podUID="90dfe00c-70eb-4a27-bcea-7f82655ba36c" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 20:52:21.248351 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:52:21.248300 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-vv4x6" podStartSLOduration=1.8324864459999999 podStartE2EDuration="4.248284198s" podCreationTimestamp="2026-04-17 20:52:17 +0000 UTC" firstStartedPulling="2026-04-17 20:52:17.896570514 +0000 UTC m=+503.695448664" lastFinishedPulling="2026-04-17 20:52:20.312368267 +0000 UTC m=+506.111246416" observedRunningTime="2026-04-17 20:52:21.24615581 +0000 UTC m=+507.045033976" watchObservedRunningTime="2026-04-17 20:52:21.248284198 +0000 UTC m=+507.047162365"
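
The prober records above show the istiod discovery container failing its HTTP readiness probe once with a 503 before the pod flips to ready a second later (first record below): the readiness endpoint answers 503 until the server has finished initializing. A generic illustration of that pattern, not istiod's actual handler (the names, port, and 1s delay here are made up for the sketch):

```python
# Minimal illustration of a readiness endpoint that returns 503 until
# the process is initialized -- the failure mode the prober logged above.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

ready = threading.Event()

class Probe(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 while initializing, 200 once ready.
        self.send_response(200 if ready.is_set() else 503)
        self.end_headers()

threading.Timer(1.0, ready.set).start()  # "becomes ready" ~1s later
HTTPServer(("127.0.0.1", 8080), Probe).serve_forever()
```
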
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:28.727295 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.727270 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-mltmf\"" Apr 17 20:53:28.727557 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.727536 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:53:28.727739 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.727712 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:53:28.727845 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.727795 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 20:53:28.735521 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.735496 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm"] Apr 17 20:53:28.810208 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.810182 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4dv\" (UniqueName: \"kubernetes.io/projected/1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0-kube-api-access-6n4dv\") pod \"dns-operator-controller-manager-648d5c98bc-mgrqm\" (UID: \"1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:28.911191 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.911164 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4dv\" (UniqueName: \"kubernetes.io/projected/1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0-kube-api-access-6n4dv\") pod \"dns-operator-controller-manager-648d5c98bc-mgrqm\" (UID: \"1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:28.920880 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:28.920852 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4dv\" (UniqueName: \"kubernetes.io/projected/1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0-kube-api-access-6n4dv\") pod \"dns-operator-controller-manager-648d5c98bc-mgrqm\" (UID: \"1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:29.036273 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:29.036220 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:29.157619 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:29.157581 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm"] Apr 17 20:53:29.160932 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:53:29.160901 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4a7e51_5031_4cd5_9e72_a8a8f309f0d0.slice/crio-930aba4f23bf6965e4d42f18a28832c00fe725193656bbc40e43e3d50dc197b6 WatchSource:0}: Error finding container 930aba4f23bf6965e4d42f18a28832c00fe725193656bbc40e43e3d50dc197b6: Status 404 returned error can't find the container with id 930aba4f23bf6965e4d42f18a28832c00fe725193656bbc40e43e3d50dc197b6 Apr 17 20:53:29.439667 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:29.439628 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" event={"ID":"1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0","Type":"ContainerStarted","Data":"930aba4f23bf6965e4d42f18a28832c00fe725193656bbc40e43e3d50dc197b6"} Apr 17 20:53:31.446831 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:31.446798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" event={"ID":"1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0","Type":"ContainerStarted","Data":"ecd1e5ddfc9bcd1251ed15c739a937c2c2ab1f7b11814c2219f8056e72a85a63"} Apr 17 20:53:31.447252 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:31.446856 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:31.463520 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:31.463397 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" podStartSLOduration=1.411531818 podStartE2EDuration="3.463380321s" podCreationTimestamp="2026-04-17 20:53:28 +0000 UTC" firstStartedPulling="2026-04-17 20:53:29.162877946 +0000 UTC m=+574.961756094" lastFinishedPulling="2026-04-17 20:53:31.214726437 +0000 UTC m=+577.013604597" observedRunningTime="2026-04-17 20:53:31.462369828 +0000 UTC m=+577.261247993" watchObservedRunningTime="2026-04-17 20:53:31.463380321 +0000 UTC m=+577.262258527" Apr 17 20:53:33.866133 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:33.866100 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x"] Apr 17 20:53:33.869270 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:33.869252 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:33.874048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:33.874026 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-92nqz\"" Apr 17 20:53:33.881210 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:33.881181 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x"] Apr 17 20:53:33.948422 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:33.948397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfl9\" (UniqueName: \"kubernetes.io/projected/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-kube-api-access-ppfl9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:33.948548 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:33.948443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:34.049308 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.049273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:34.049487 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.049336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfl9\" (UniqueName: \"kubernetes.io/projected/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-kube-api-access-ppfl9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:34.049666 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.049647 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:34.059656 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.059633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfl9\" (UniqueName: \"kubernetes.io/projected/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-kube-api-access-ppfl9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:34.179662 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.179586 2567 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:34.310029 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.309994 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x"] Apr 17 20:53:34.312668 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:53:34.312640 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cab96dc_c6f1_40ca_afe7_d181177ae2e0.slice/crio-ffe83e32c118918745beedb0ca5b4c0f3359cecbaa843486966c8d5e2f0bca8d WatchSource:0}: Error finding container ffe83e32c118918745beedb0ca5b4c0f3359cecbaa843486966c8d5e2f0bca8d: Status 404 returned error can't find the container with id ffe83e32c118918745beedb0ca5b4c0f3359cecbaa843486966c8d5e2f0bca8d Apr 17 20:53:34.457703 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:34.457631 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" event={"ID":"0cab96dc-c6f1-40ca-afe7-d181177ae2e0","Type":"ContainerStarted","Data":"ffe83e32c118918745beedb0ca5b4c0f3359cecbaa843486966c8d5e2f0bca8d"} Apr 17 20:53:39.475833 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:39.475803 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" event={"ID":"0cab96dc-c6f1-40ca-afe7-d181177ae2e0","Type":"ContainerStarted","Data":"8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930"} Apr 17 20:53:39.476107 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:39.475913 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:39.492517 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:39.492465 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" podStartSLOduration=1.436383334 podStartE2EDuration="6.492381785s" podCreationTimestamp="2026-04-17 20:53:33 +0000 UTC" firstStartedPulling="2026-04-17 20:53:34.31492802 +0000 UTC m=+580.113806167" lastFinishedPulling="2026-04-17 20:53:39.37092646 +0000 UTC m=+585.169804618" observedRunningTime="2026-04-17 20:53:39.491150104 +0000 UTC m=+585.290028270" watchObservedRunningTime="2026-04-17 20:53:39.492381785 +0000 UTC m=+585.291259955" Apr 17 20:53:42.452214 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:42.452184 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-mgrqm" Apr 17 20:53:50.481476 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:50.481420 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:51.470920 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.470887 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467"] Apr 17 20:53:51.474148 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.474124 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.487857 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.487830 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467"] Apr 17 20:53:51.596179 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.596150 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.596332 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.596195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsslc\" (UniqueName: \"kubernetes.io/projected/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-kube-api-access-hsslc\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.697503 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.697469 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.697630 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.697516 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsslc\" (UniqueName: \"kubernetes.io/projected/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-kube-api-access-hsslc\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.697865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.697841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.705101 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.705082 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsslc\" (UniqueName: \"kubernetes.io/projected/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-kube-api-access-hsslc\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.784229 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.784172 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:51.918163 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:51.918139 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467"] Apr 17 20:53:51.920206 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:53:51.920177 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc653ee53_f8cf_49ad_9c7d_a05ee0c353ed.slice/crio-6d2f093a7c78acf4b8f23cd430ae2158388754e2585bebf4f58ef755c483d98a WatchSource:0}: Error finding container 6d2f093a7c78acf4b8f23cd430ae2158388754e2585bebf4f58ef755c483d98a: Status 404 returned error can't find the container with id 6d2f093a7c78acf4b8f23cd430ae2158388754e2585bebf4f58ef755c483d98a Apr 17 20:53:52.185719 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.185677 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x"] Apr 17 20:53:52.185948 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.185923 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" containerName="manager" containerID="cri-o://8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930" gracePeriod=2 Apr 17 20:53:52.191032 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.191000 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x"] Apr 17 20:53:52.199593 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.199559 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467"] Apr 17 20:53:52.209285 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.209261 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467"] Apr 17 20:53:52.220602 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.220546 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st"] Apr 17 20:53:52.220982 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.220958 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" containerName="manager" Apr 17 20:53:52.220982 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.220984 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" containerName="manager" Apr 17 20:53:52.221152 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.221003 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" containerName="manager" Apr 17 20:53:52.221152 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.221011 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" containerName="manager" Apr 17 20:53:52.221152 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.221081 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" containerName="manager" Apr 17 20:53:52.221152 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.221096 2567 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" containerName="manager" Apr 17 20:53:52.223906 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.223875 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.225581 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.225551 2567 status_manager.go:895] "Failed to get status for pod" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.242675 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.242646 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st"] Apr 17 20:53:52.253048 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.253026 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh"] Apr 17 20:53:52.256059 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.256039 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:53:52.258514 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.258139 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-cm9r2\"" Apr 17 20:53:52.258924 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.258759 2567 status_manager.go:895] "Failed to get status for pod" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.275045 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.275021 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh"] Apr 17 20:53:52.401693 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.401672 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:52.402928 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.402908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hh9st\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.403002 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.402977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8wm5\" (UniqueName: \"kubernetes.io/projected/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-kube-api-access-h8wm5\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hh9st\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.403043 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.403016 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvzm\" (UniqueName: \"kubernetes.io/projected/671a159a-da7e-4dcf-9974-da40e971135d-kube-api-access-jwvzm\") pod \"limitador-operator-controller-manager-85c4996f8c-td5vh\" (UID: \"671a159a-da7e-4dcf-9974-da40e971135d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:53:52.403676 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.403651 2567 status_manager.go:895] "Failed to get status for pod" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.503470 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.503403 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppfl9\" (UniqueName: \"kubernetes.io/projected/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-kube-api-access-ppfl9\") pod \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " Apr 17 20:53:52.503820 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.503477 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-extensions-socket-volume\") pod \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\" (UID: \"0cab96dc-c6f1-40ca-afe7-d181177ae2e0\") " Apr 17 20:53:52.503820 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.503543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8wm5\" (UniqueName: \"kubernetes.io/projected/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-kube-api-access-h8wm5\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hh9st\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.503820 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.503565 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jwvzm\" (UniqueName: \"kubernetes.io/projected/671a159a-da7e-4dcf-9974-da40e971135d-kube-api-access-jwvzm\") pod \"limitador-operator-controller-manager-85c4996f8c-td5vh\" (UID: \"671a159a-da7e-4dcf-9974-da40e971135d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:53:52.503820 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.503607 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hh9st\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.504023 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.503935 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hh9st\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.504076 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.504040 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0cab96dc-c6f1-40ca-afe7-d181177ae2e0" (UID: "0cab96dc-c6f1-40ca-afe7-d181177ae2e0"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:53:52.505531 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.505511 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-kube-api-access-ppfl9" (OuterVolumeSpecName: "kube-api-access-ppfl9") pod "0cab96dc-c6f1-40ca-afe7-d181177ae2e0" (UID: "0cab96dc-c6f1-40ca-afe7-d181177ae2e0"). InnerVolumeSpecName "kube-api-access-ppfl9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:53:52.512006 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.511982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8wm5\" (UniqueName: \"kubernetes.io/projected/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-kube-api-access-h8wm5\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hh9st\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.512132 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.512110 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvzm\" (UniqueName: \"kubernetes.io/projected/671a159a-da7e-4dcf-9974-da40e971135d-kube-api-access-jwvzm\") pod \"limitador-operator-controller-manager-85c4996f8c-td5vh\" (UID: \"671a159a-da7e-4dcf-9974-da40e971135d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:53:52.523264 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:53:52.523244 2567 kuberuntime_manager.go:623] "Missing actuated resource record" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" container="manager" Apr 17 20:53:52.524209 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.524189 2567 generic.go:358] "Generic (PLEG): container finished" podID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" containerID="8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930" exitCode=0 Apr 17 20:53:52.524303 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.524234 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" Apr 17 20:53:52.524303 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.524270 2567 scope.go:117] "RemoveContainer" containerID="8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930" Apr 17 20:53:52.525089 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.525051 2567 status_manager.go:895] "Failed to get status for pod" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.526637 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.526609 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.528288 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.528259 2567 status_manager.go:895] "Failed to get status for pod" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace 
\"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.529923 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.529900 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.533561 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.533515 2567 scope.go:117] "RemoveContainer" containerID="8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930" Apr 17 20:53:52.533804 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:53:52.533774 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930\": container with ID starting with 8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930 not found: ID does not exist" containerID="8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930" Apr 17 20:53:52.533870 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.533801 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930"} err="failed to get container status \"8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930\": rpc error: code = NotFound desc = could not find container \"8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930\": container with ID starting with 8898ba98c996df750c45cc6c18fbcc45903f103a946dfc7fb5c6c0e93b380930 not found: ID does not exist" Apr 17 20:53:52.533914 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.533890 2567 status_manager.go:895] "Failed to get status for pod" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dmp6x" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dmp6x\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.535280 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.535257 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:52.563720 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.563691 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:52.569402 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.569380 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:53:52.604394 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.604356 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-extensions-socket-volume\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:53:52.604394 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.604398 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ppfl9\" (UniqueName: \"kubernetes.io/projected/0cab96dc-c6f1-40ca-afe7-d181177ae2e0-kube-api-access-ppfl9\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:53:52.697426 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.697390 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st"] Apr 17 20:53:52.701238 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:53:52.701209 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d2bce8_dcbb_48d5_ae66_cf8c778b4469.slice/crio-26fa2184dec31318ebd471ee36dd9e80903c29903d8784c560b17fbc5513f746 WatchSource:0}: Error finding container 26fa2184dec31318ebd471ee36dd9e80903c29903d8784c560b17fbc5513f746: Status 404 returned error can't find the container with id 26fa2184dec31318ebd471ee36dd9e80903c29903d8784c560b17fbc5513f746 Apr 17 20:53:52.712185 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.712155 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cab96dc-c6f1-40ca-afe7-d181177ae2e0" path="/var/lib/kubelet/pods/0cab96dc-c6f1-40ca-afe7-d181177ae2e0/volumes" Apr 17 20:53:52.712539 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:52.712520 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh"] Apr 17 20:53:52.713127 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:53:52.713107 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671a159a_da7e_4dcf_9974_da40e971135d.slice/crio-d2c6515077c6aaad5556bc39890d24bf13d6c1543c7be0a4a8292e848dfa9a0d WatchSource:0}: Error finding container d2c6515077c6aaad5556bc39890d24bf13d6c1543c7be0a4a8292e848dfa9a0d: Status 404 returned error can't find the container with id d2c6515077c6aaad5556bc39890d24bf13d6c1543c7be0a4a8292e848dfa9a0d Apr 17 20:53:53.530112 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.530073 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" event={"ID":"671a159a-da7e-4dcf-9974-da40e971135d","Type":"ContainerStarted","Data":"d2c6515077c6aaad5556bc39890d24bf13d6c1543c7be0a4a8292e848dfa9a0d"} Apr 17 20:53:53.531431 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.531397 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" containerName="manager" containerID="cri-o://c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3" gracePeriod=2 Apr 17 20:53:53.532750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.532724 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" event={"ID":"50d2bce8-dcbb-48d5-ae66-cf8c778b4469","Type":"ContainerStarted","Data":"ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa"} Apr 17 20:53:53.532750 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.532751 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" event={"ID":"50d2bce8-dcbb-48d5-ae66-cf8c778b4469","Type":"ContainerStarted","Data":"26fa2184dec31318ebd471ee36dd9e80903c29903d8784c560b17fbc5513f746"} Apr 17 20:53:53.532911 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.532834 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:53:53.567909 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.567868 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" podStartSLOduration=1.567855021 podStartE2EDuration="1.567855021s" podCreationTimestamp="2026-04-17 20:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:53:53.566107516 +0000 UTC m=+599.364985684" watchObservedRunningTime="2026-04-17 20:53:53.567855021 +0000 UTC m=+599.366733187" Apr 17 20:53:53.771005 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.770982 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:53.772698 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.772674 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:53.914620 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.914589 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-extensions-socket-volume\") pod \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " Apr 17 20:53:53.914795 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.914637 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsslc\" (UniqueName: \"kubernetes.io/projected/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-kube-api-access-hsslc\") pod \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\" (UID: \"c653ee53-f8cf-49ad-9c7d-a05ee0c353ed\") " Apr 17 20:53:53.915025 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.915001 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" (UID: "c653ee53-f8cf-49ad-9c7d-a05ee0c353ed"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:53:53.916908 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:53.916878 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-kube-api-access-hsslc" (OuterVolumeSpecName: "kube-api-access-hsslc") pod "c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" (UID: "c653ee53-f8cf-49ad-9c7d-a05ee0c353ed"). InnerVolumeSpecName "kube-api-access-hsslc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:53:54.015706 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.015670 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-extensions-socket-volume\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:53:54.015706 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.015709 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsslc\" (UniqueName: \"kubernetes.io/projected/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed-kube-api-access-hsslc\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:53:54.537259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.537166 2567 generic.go:358] "Generic (PLEG): container finished" podID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" containerID="c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3" exitCode=0 Apr 17 20:53:54.537259 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.537231 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" Apr 17 20:53:54.537764 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.537278 2567 scope.go:117] "RemoveContainer" containerID="c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3" Apr 17 20:53:54.538865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.538838 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" event={"ID":"671a159a-da7e-4dcf-9974-da40e971135d","Type":"ContainerStarted","Data":"389a3e6127b0bfdec597a5c6cf50f8d01876a0e57f348a74e411f8b0413c18dc"} Apr 17 20:53:54.539121 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.539103 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:53:54.539315 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.539294 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:54.540854 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.540831 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": 
no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:54.545823 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.545805 2567 scope.go:117] "RemoveContainer" containerID="c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3" Apr 17 20:53:54.546076 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:53:54.546052 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3\": container with ID starting with c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3 not found: ID does not exist" containerID="c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3" Apr 17 20:53:54.546151 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.546076 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3"} err="failed to get container status \"c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3\": rpc error: code = NotFound desc = could not find container \"c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3\": container with ID starting with c18e1e9e9988a9a2c9d763e0bf9366170e74736836c745558a73867d161aacb3 not found: ID does not exist" Apr 17 20:53:54.561731 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.561692 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" podStartSLOduration=1.088198559 podStartE2EDuration="2.561681104s" podCreationTimestamp="2026-04-17 20:53:52 +0000 UTC" firstStartedPulling="2026-04-17 20:53:52.716989296 +0000 UTC m=+598.515867444" lastFinishedPulling="2026-04-17 20:53:54.190471834 +0000 UTC m=+599.989349989" observedRunningTime="2026-04-17 20:53:54.56008852 +0000 UTC m=+600.358966686" watchObservedRunningTime="2026-04-17 20:53:54.561681104 +0000 UTC m=+600.360559269" Apr 17 20:53:54.562713 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.562685 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:54.711361 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.711331 2567 status_manager.go:895] "Failed to get status for pod" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pt467" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pt467\" is forbidden: User \"system:node:ip-10-0-139-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-255.ec2.internal' and this object" Apr 17 20:53:54.713062 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:53:54.713035 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c653ee53-f8cf-49ad-9c7d-a05ee0c353ed" path="/var/lib/kubelet/pods/c653ee53-f8cf-49ad-9c7d-a05ee0c353ed/volumes" Apr 17 20:54:04.541051 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:04.541020 2567 kubelet.go:2658] 
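Note on the teardown sequence above: each replacement follows the same arc, "Killing container with a grace period" (gracePeriod=2 above), then "Generic (PLEG): container finished ... exitCode=0", then a second "RemoveContainer" whose ContainerStatus/DeleteContainer NotFound errors look benign, since CRI-O has apparently already pruned the container by the time the duplicate cleanup fires. A sketch (assumed helper, using the klog timestamps visible in the entries above) to check how quickly each container actually exited relative to its grace period:

```python
import re
from datetime import datetime

# Assumed helper, not part of the log: pair each "Killing container with a
# grace period" entry with its "Generic (PLEG): container finished" entry
# and report time-to-exit versus the requested grace period. Only the
# time-of-day from the klog prefix (e.g. I0417 20:53:52.185948) is parsed;
# the date is irrelevant to the subtraction.
TS = r'[IWE]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})'
KILL = re.compile(TS + r' \d+ kuberuntime_container\.go:\d+\] "Killing container'
                       r' with a grace period".*?containerID="cri-o://(?P<cid>[0-9a-f]+)"'
                       r' gracePeriod=(?P<grace>\d+)')
DONE = re.compile(TS + r' \d+ generic\.go:\d+\] "Generic \(PLEG\): container finished"'
                       r'.*?containerID="(?P<cid>[0-9a-f]+)" exitCode=\d+')

def t(stamp):
    return datetime.strptime(stamp, "%H:%M:%S.%f")

def kill_latencies(journal_text):
    """Yield (container_id_prefix, seconds_to_exit, grace_period_seconds)."""
    kills = {m["cid"]: (t(m.group(1)), int(m["grace"]))
             for m in KILL.finditer(journal_text)}
    for m in DONE.finditer(journal_text):
        if m["cid"] in kills:
            started, grace = kills[m["cid"]]
            yield m["cid"][:12], (t(m.group(1)) - started).total_seconds(), grace

if __name__ == "__main__":
    with open("kubelet.log") as f:  # hypothetical saved journal dump
        for cid, secs, grace in kill_latencies(f.read()):
            print(f"{cid}: exited in {secs:.3f}s (gracePeriod={grace}s)")
```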
"SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:54:04.911081 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:04.911045 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74"] Apr 17 20:54:04.916685 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:04.916665 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:04.925605 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:04.925579 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74"] Apr 17 20:54:04.999816 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:04.999782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/250066ba-3bef-486f-a1cc-5cc613ccfbbd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wrx74\" (UID: \"250066ba-3bef-486f-a1cc-5cc613ccfbbd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:04.999953 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:04.999841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltvl\" (UniqueName: \"kubernetes.io/projected/250066ba-3bef-486f-a1cc-5cc613ccfbbd-kube-api-access-jltvl\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wrx74\" (UID: \"250066ba-3bef-486f-a1cc-5cc613ccfbbd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:05.100803 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.100771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jltvl\" (UniqueName: \"kubernetes.io/projected/250066ba-3bef-486f-a1cc-5cc613ccfbbd-kube-api-access-jltvl\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wrx74\" (UID: \"250066ba-3bef-486f-a1cc-5cc613ccfbbd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:05.100942 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.100869 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/250066ba-3bef-486f-a1cc-5cc613ccfbbd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wrx74\" (UID: \"250066ba-3bef-486f-a1cc-5cc613ccfbbd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:05.101254 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.101236 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/250066ba-3bef-486f-a1cc-5cc613ccfbbd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wrx74\" (UID: \"250066ba-3bef-486f-a1cc-5cc613ccfbbd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:05.108019 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.107994 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jltvl\" (UniqueName: \"kubernetes.io/projected/250066ba-3bef-486f-a1cc-5cc613ccfbbd-kube-api-access-jltvl\") pod 
\"kuadrant-operator-controller-manager-55c7f4c975-wrx74\" (UID: \"250066ba-3bef-486f-a1cc-5cc613ccfbbd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:05.227056 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.226999 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:05.544542 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.544470 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-td5vh" Apr 17 20:54:05.550793 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.550760 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74"] Apr 17 20:54:05.554481 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:54:05.554437 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250066ba_3bef_486f_a1cc_5cc613ccfbbd.slice/crio-56e97347020c71d403d5f9497f6945499035ca9a5740e6407eda182646e1f4ca WatchSource:0}: Error finding container 56e97347020c71d403d5f9497f6945499035ca9a5740e6407eda182646e1f4ca: Status 404 returned error can't find the container with id 56e97347020c71d403d5f9497f6945499035ca9a5740e6407eda182646e1f4ca Apr 17 20:54:05.575154 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:05.575127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" event={"ID":"250066ba-3bef-486f-a1cc-5cc613ccfbbd","Type":"ContainerStarted","Data":"56e97347020c71d403d5f9497f6945499035ca9a5740e6407eda182646e1f4ca"} Apr 17 20:54:06.579412 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:06.579372 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" event={"ID":"250066ba-3bef-486f-a1cc-5cc613ccfbbd","Type":"ContainerStarted","Data":"accea7cda3eb0124265c1a0ba55ea47d9de8d65fdbede065b53fd250b301fe12"} Apr 17 20:54:06.579788 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:06.579588 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:06.601345 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:06.601274 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" podStartSLOduration=2.601257579 podStartE2EDuration="2.601257579s" podCreationTimestamp="2026-04-17 20:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:54:06.599605889 +0000 UTC m=+612.398484055" watchObservedRunningTime="2026-04-17 20:54:06.601257579 +0000 UTC m=+612.400135747" Apr 17 20:54:17.586145 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.586114 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wrx74" Apr 17 20:54:17.626005 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.625975 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st"] Apr 17 20:54:17.626281 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.626256 2567 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" podUID="50d2bce8-dcbb-48d5-ae66-cf8c778b4469" containerName="manager" containerID="cri-o://ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa" gracePeriod=10 Apr 17 20:54:17.859878 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.859857 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:54:17.890751 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.890725 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8wm5\" (UniqueName: \"kubernetes.io/projected/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-kube-api-access-h8wm5\") pod \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " Apr 17 20:54:17.890891 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.890788 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-extensions-socket-volume\") pod \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\" (UID: \"50d2bce8-dcbb-48d5-ae66-cf8c778b4469\") " Apr 17 20:54:17.891116 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.891087 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "50d2bce8-dcbb-48d5-ae66-cf8c778b4469" (UID: "50d2bce8-dcbb-48d5-ae66-cf8c778b4469"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:54:17.892851 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.892825 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-kube-api-access-h8wm5" (OuterVolumeSpecName: "kube-api-access-h8wm5") pod "50d2bce8-dcbb-48d5-ae66-cf8c778b4469" (UID: "50d2bce8-dcbb-48d5-ae66-cf8c778b4469"). InnerVolumeSpecName "kube-api-access-h8wm5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:54:17.992011 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.991991 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-extensions-socket-volume\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:54:17.992011 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:17.992010 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8wm5\" (UniqueName: \"kubernetes.io/projected/50d2bce8-dcbb-48d5-ae66-cf8c778b4469-kube-api-access-h8wm5\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:54:18.624200 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.624167 2567 generic.go:358] "Generic (PLEG): container finished" podID="50d2bce8-dcbb-48d5-ae66-cf8c778b4469" containerID="ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa" exitCode=0 Apr 17 20:54:18.624655 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.624227 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" Apr 17 20:54:18.624655 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.624255 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" event={"ID":"50d2bce8-dcbb-48d5-ae66-cf8c778b4469","Type":"ContainerDied","Data":"ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa"} Apr 17 20:54:18.624655 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.624292 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st" event={"ID":"50d2bce8-dcbb-48d5-ae66-cf8c778b4469","Type":"ContainerDied","Data":"26fa2184dec31318ebd471ee36dd9e80903c29903d8784c560b17fbc5513f746"} Apr 17 20:54:18.624655 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.624307 2567 scope.go:117] "RemoveContainer" containerID="ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa" Apr 17 20:54:18.632911 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.632578 2567 scope.go:117] "RemoveContainer" containerID="ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa" Apr 17 20:54:18.632911 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:54:18.632900 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa\": container with ID starting with ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa not found: ID does not exist" containerID="ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa" Apr 17 20:54:18.633013 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.632929 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa"} err="failed to get container status \"ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa\": rpc error: code = NotFound desc = could not find container \"ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa\": container with ID starting with ac4f67b70aaf5b707c0a71cd766b912ba0886116df0f54244cd5c891fe40b4fa not found: ID does not exist" Apr 17 20:54:18.647117 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.647091 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st"] Apr 17 20:54:18.650465 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.650431 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hh9st"] Apr 17 20:54:18.711984 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:18.711962 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d2bce8-dcbb-48d5-ae66-cf8c778b4469" path="/var/lib/kubelet/pods/50d2bce8-dcbb-48d5-ae66-cf8c778b4469/volumes" Apr 17 20:54:36.407400 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.407361 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-zqdt7"] Apr 17 20:54:36.407865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.407669 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50d2bce8-dcbb-48d5-ae66-cf8c778b4469" containerName="manager" Apr 17 20:54:36.407865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.407680 2567 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="50d2bce8-dcbb-48d5-ae66-cf8c778b4469" containerName="manager" Apr 17 20:54:36.407865 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.407732 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="50d2bce8-dcbb-48d5-ae66-cf8c778b4469" containerName="manager" Apr 17 20:54:36.410466 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.410431 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:54:36.412208 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.412181 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-9jg8g\"" Apr 17 20:54:36.415294 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.415267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-zqdt7"] Apr 17 20:54:36.536643 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.536599 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xfd\" (UniqueName: \"kubernetes.io/projected/3047ed68-4254-4783-abc4-75c02964e7d2-kube-api-access-z6xfd\") pod \"authorino-7498df8756-zqdt7\" (UID: \"3047ed68-4254-4783-abc4-75c02964e7d2\") " pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:54:36.637912 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.637889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xfd\" (UniqueName: \"kubernetes.io/projected/3047ed68-4254-4783-abc4-75c02964e7d2-kube-api-access-z6xfd\") pod \"authorino-7498df8756-zqdt7\" (UID: \"3047ed68-4254-4783-abc4-75c02964e7d2\") " pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:54:36.644911 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.644894 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xfd\" (UniqueName: \"kubernetes.io/projected/3047ed68-4254-4783-abc4-75c02964e7d2-kube-api-access-z6xfd\") pod \"authorino-7498df8756-zqdt7\" (UID: \"3047ed68-4254-4783-abc4-75c02964e7d2\") " pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:54:36.720848 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.720790 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:54:36.837476 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:36.837429 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-zqdt7"] Apr 17 20:54:36.841209 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:54:36.841181 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3047ed68_4254_4783_abc4_75c02964e7d2.slice/crio-bf100592d72ecaac3ea32bb713722d66724591d0bd3f0044b6e4334b7779b696 WatchSource:0}: Error finding container bf100592d72ecaac3ea32bb713722d66724591d0bd3f0044b6e4334b7779b696: Status 404 returned error can't find the container with id bf100592d72ecaac3ea32bb713722d66724591d0bd3f0044b6e4334b7779b696 Apr 17 20:54:37.687233 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:37.687124 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-zqdt7" event={"ID":"3047ed68-4254-4783-abc4-75c02964e7d2","Type":"ContainerStarted","Data":"bf100592d72ecaac3ea32bb713722d66724591d0bd3f0044b6e4334b7779b696"} Apr 17 20:54:40.700384 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:40.700340 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-zqdt7" event={"ID":"3047ed68-4254-4783-abc4-75c02964e7d2","Type":"ContainerStarted","Data":"dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196"} Apr 17 20:54:40.713862 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:54:40.713815 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-zqdt7" podStartSLOduration=1.770852673 podStartE2EDuration="4.713802059s" podCreationTimestamp="2026-04-17 20:54:36 +0000 UTC" firstStartedPulling="2026-04-17 20:54:36.842489837 +0000 UTC m=+642.641367981" lastFinishedPulling="2026-04-17 20:54:39.785439219 +0000 UTC m=+645.584317367" observedRunningTime="2026-04-17 20:54:40.711923917 +0000 UTC m=+646.510802083" watchObservedRunningTime="2026-04-17 20:54:40.713802059 +0000 UTC m=+646.512680222" Apr 17 20:55:10.191326 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.191286 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-zqdt7"] Apr 17 20:55:10.191957 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.191510 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-zqdt7" podUID="3047ed68-4254-4783-abc4-75c02964e7d2" containerName="authorino" containerID="cri-o://dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196" gracePeriod=30 Apr 17 20:55:10.428321 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.428301 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:55:10.584834 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.584805 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6xfd\" (UniqueName: \"kubernetes.io/projected/3047ed68-4254-4783-abc4-75c02964e7d2-kube-api-access-z6xfd\") pod \"3047ed68-4254-4783-abc4-75c02964e7d2\" (UID: \"3047ed68-4254-4783-abc4-75c02964e7d2\") " Apr 17 20:55:10.586860 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.586830 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3047ed68-4254-4783-abc4-75c02964e7d2-kube-api-access-z6xfd" (OuterVolumeSpecName: "kube-api-access-z6xfd") pod "3047ed68-4254-4783-abc4-75c02964e7d2" (UID: "3047ed68-4254-4783-abc4-75c02964e7d2"). InnerVolumeSpecName "kube-api-access-z6xfd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:55:10.686314 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.686286 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6xfd\" (UniqueName: \"kubernetes.io/projected/3047ed68-4254-4783-abc4-75c02964e7d2-kube-api-access-z6xfd\") on node \"ip-10-0-139-255.ec2.internal\" DevicePath \"\"" Apr 17 20:55:10.802237 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.802203 2567 generic.go:358] "Generic (PLEG): container finished" podID="3047ed68-4254-4783-abc4-75c02964e7d2" containerID="dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196" exitCode=0 Apr 17 20:55:10.802378 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.802252 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-zqdt7" Apr 17 20:55:10.802378 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.802275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-zqdt7" event={"ID":"3047ed68-4254-4783-abc4-75c02964e7d2","Type":"ContainerDied","Data":"dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196"} Apr 17 20:55:10.802378 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.802302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-zqdt7" event={"ID":"3047ed68-4254-4783-abc4-75c02964e7d2","Type":"ContainerDied","Data":"bf100592d72ecaac3ea32bb713722d66724591d0bd3f0044b6e4334b7779b696"} Apr 17 20:55:10.802378 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.802318 2567 scope.go:117] "RemoveContainer" containerID="dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196" Apr 17 20:55:10.810211 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.810186 2567 scope.go:117] "RemoveContainer" containerID="dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196" Apr 17 20:55:10.810477 ip-10-0-139-255 kubenswrapper[2567]: E0417 20:55:10.810443 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196\": container with ID starting with dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196 not found: ID does not exist" containerID="dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196" Apr 17 20:55:10.810630 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.810481 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196"} err="failed to get container status \"dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196\": rpc error: code = NotFound desc = could not find container \"dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196\": container with ID starting with dde63c8c665c00c19669a454f5c0449278230d33dabbaf2ec5578c2881764196 not found: ID does not exist" Apr 17 20:55:10.817969 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.817945 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-zqdt7"] Apr 17 20:55:10.821719 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:10.821699 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-zqdt7"] Apr 17 20:55:12.712170 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:12.712131 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3047ed68-4254-4783-abc4-75c02964e7d2" path="/var/lib/kubelet/pods/3047ed68-4254-4783-abc4-75c02964e7d2/volumes" Apr 17 20:55:26.533955 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.533918 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-78f96b4b44-wf2lc"] Apr 17 20:55:26.534384 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.534201 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3047ed68-4254-4783-abc4-75c02964e7d2" containerName="authorino" Apr 17 20:55:26.534384 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.534212 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3047ed68-4254-4783-abc4-75c02964e7d2" containerName="authorino" Apr 17 20:55:26.534384 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.534267 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3047ed68-4254-4783-abc4-75c02964e7d2" containerName="authorino" Apr 17 20:55:26.538373 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.538357 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 20:55:26.540115 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.540087 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-d94fj\"" Apr 17 20:55:26.546509 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.546488 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-78f96b4b44-wf2lc"] Apr 17 20:55:26.700442 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.700397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vszx9\" (UniqueName: \"kubernetes.io/projected/91c5e488-5246-4633-9931-60a92ee29cee-kube-api-access-vszx9\") pod \"maas-controller-78f96b4b44-wf2lc\" (UID: \"91c5e488-5246-4633-9931-60a92ee29cee\") " pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 20:55:26.801197 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.801127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vszx9\" (UniqueName: \"kubernetes.io/projected/91c5e488-5246-4633-9931-60a92ee29cee-kube-api-access-vszx9\") pod \"maas-controller-78f96b4b44-wf2lc\" (UID: \"91c5e488-5246-4633-9931-60a92ee29cee\") " pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 20:55:26.808247 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.808215 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vszx9\" (UniqueName: \"kubernetes.io/projected/91c5e488-5246-4633-9931-60a92ee29cee-kube-api-access-vszx9\") pod \"maas-controller-78f96b4b44-wf2lc\" (UID: \"91c5e488-5246-4633-9931-60a92ee29cee\") " pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 20:55:26.849226 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.849203 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 20:55:26.962147 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:26.962112 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-78f96b4b44-wf2lc"] Apr 17 20:55:26.965082 ip-10-0-139-255 kubenswrapper[2567]: W0417 20:55:26.965057 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c5e488_5246_4633_9931_60a92ee29cee.slice/crio-52a352b1e69fc47d190dd573089f39d7154f1a28e239f26fcb3adff473a00cd6 WatchSource:0}: Error finding container 52a352b1e69fc47d190dd573089f39d7154f1a28e239f26fcb3adff473a00cd6: Status 404 returned error can't find the container with id 52a352b1e69fc47d190dd573089f39d7154f1a28e239f26fcb3adff473a00cd6 Apr 17 20:55:27.859521 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:27.859482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" event={"ID":"91c5e488-5246-4633-9931-60a92ee29cee","Type":"ContainerStarted","Data":"52a352b1e69fc47d190dd573089f39d7154f1a28e239f26fcb3adff473a00cd6"} Apr 17 20:55:29.867756 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:29.867716 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" event={"ID":"91c5e488-5246-4633-9931-60a92ee29cee","Type":"ContainerStarted","Data":"15a75bad8565d0f043f6f6749828b175b1bf7e75119268fff0802446b0682dae"} Apr 17 20:55:29.868176 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:29.867846 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 20:55:29.881857 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:29.881809 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" podStartSLOduration=1.8464299990000002 podStartE2EDuration="3.881796796s" podCreationTimestamp="2026-04-17 20:55:26 +0000 UTC" firstStartedPulling="2026-04-17 20:55:26.966311071 +0000 UTC m=+692.765189217" lastFinishedPulling="2026-04-17 20:55:29.001677869 +0000 UTC m=+694.800556014" observedRunningTime="2026-04-17 20:55:29.880230106 +0000 UTC m=+695.679108274" watchObservedRunningTime="2026-04-17 20:55:29.881796796 +0000 UTC m=+695.680674963" Apr 17 20:55:40.875935 ip-10-0-139-255 kubenswrapper[2567]: I0417 20:55:40.875312 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-78f96b4b44-wf2lc" Apr 17 21:05:51.356178 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:51.356151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-wwhfg_418163da-535c-405a-bff0-684797bf68e4/manager/0.log" Apr 17 21:05:51.465785 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:51.465766 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-78f96b4b44-wf2lc_91c5e488-5246-4633-9931-60a92ee29cee/manager/0.log" Apr 17 21:05:51.577566 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:51.577529 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jc74h_37ba6f55-8094-4ab4-b24b-6330f49f9c07/manager/2.log" Apr 17 21:05:51.815843 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:51.815782 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6dc4849f89-ndxfz_e113327f-ec56-4733-9105-3369eb3947b4/manager/0.log" Apr 17 21:05:53.506697 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:53.506663 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-mgrqm_1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0/manager/0.log" Apr 17 21:05:53.850040 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:53.850015 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-wrx74_250066ba-3bef-486f-a1cc-5cc613ccfbbd/manager/0.log" Apr 17 21:05:54.066396 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:54.066370 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-td5vh_671a159a-da7e-4dcf-9974-da40e971135d/manager/0.log" Apr 17 21:05:54.508604 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:54.508566 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vv4x6_90dfe00c-70eb-4a27-bcea-7f82655ba36c/discovery/0.log" Apr 17 21:05:54.721012 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:05:54.720982 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-674746b5f4-g87sv_df44c2b0-79cb-48a8-95fd-7c810d41b946/kube-auth-proxy/0.log" Apr 17 21:06:01.306445 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:01.306412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4qt6s_8eeefe2c-274e-4ea9-a2c7-594d5fd9126f/global-pull-secret-syncer/0.log" Apr 17 21:06:01.465852 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:01.465827 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rjk22_1f54a916-b3e7-4361-b0c6-0ec7db5c31e6/konnectivity-agent/0.log" Apr 17 21:06:01.526605 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:01.526574 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-255.ec2.internal_e250693f0814c5dff374e113e490f4a6/haproxy/0.log" Apr 17 21:06:05.476523 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:05.476493 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-mgrqm_1c4a7e51-5031-4cd5-9e72-a8a8f309f0d0/manager/0.log" Apr 17 21:06:05.588567 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:05.588540 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-wrx74_250066ba-3bef-486f-a1cc-5cc613ccfbbd/manager/0.log" Apr 17 21:06:05.659468 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:05.659430 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-td5vh_671a159a-da7e-4dcf-9974-da40e971135d/manager/0.log" Apr 17 21:06:07.414336 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:07.414310 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wdk4r_fac65c87-d203-48d3-8dd0-754aef117237/node-exporter/0.log" Apr 17 21:06:07.438758 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:07.438719 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wdk4r_fac65c87-d203-48d3-8dd0-754aef117237/kube-rbac-proxy/0.log" Apr 17 21:06:07.459655 ip-10-0-139-255 
kubenswrapper[2567]: I0417 21:06:07.459630 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wdk4r_fac65c87-d203-48d3-8dd0-754aef117237/init-textfile/0.log" Apr 17 21:06:09.220906 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.220834 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-nd22j_23b78edd-8569-4781-bf46-bc649a833595/networking-console-plugin/0.log" Apr 17 21:06:09.905863 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.905830 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n"] Apr 17 21:06:09.909114 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.909092 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:09.911044 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.911023 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-znlkx\"/\"kube-root-ca.crt\"" Apr 17 21:06:09.911162 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.911048 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-znlkx\"/\"default-dockercfg-9snlf\"" Apr 17 21:06:09.911162 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.911022 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-znlkx\"/\"openshift-service-ca.crt\"" Apr 17 21:06:09.916275 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.916254 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n"] Apr 17 21:06:09.916543 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.916525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-sys\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:09.916600 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.916581 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-podres\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:09.916643 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.916618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-lib-modules\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:09.916643 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.916635 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfwgs\" (UniqueName: \"kubernetes.io/projected/2a07477d-9552-429a-b5e3-3eb2f236e667-kube-api-access-kfwgs\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " 
pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:09.916762 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:09.916745 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-proc\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017066 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-podres\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017184 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-lib-modules\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017184 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfwgs\" (UniqueName: \"kubernetes.io/projected/2a07477d-9552-429a-b5e3-3eb2f236e667-kube-api-access-kfwgs\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017184 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017129 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-proc\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017184 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-sys\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017349 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-podres\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017349 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-lib-modules\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017349 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017234 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-sys\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.017349 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.017252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2a07477d-9552-429a-b5e3-3eb2f236e667-proc\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.024081 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.024061 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfwgs\" (UniqueName: \"kubernetes.io/projected/2a07477d-9552-429a-b5e3-3eb2f236e667-kube-api-access-kfwgs\") pod \"perf-node-gather-daemonset-4s59n\" (UID: \"2a07477d-9552-429a-b5e3-3eb2f236e667\") " pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.219219 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.219146 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.337119 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.337093 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n"] Apr 17 21:06:10.339227 ip-10-0-139-255 kubenswrapper[2567]: W0417 21:06:10.339198 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2a07477d_9552_429a_b5e3_3eb2f236e667.slice/crio-6fda7f26e951a78790c76a4030caa687e2fb9aa052883229675a39ceab9646f2 WatchSource:0}: Error finding container 6fda7f26e951a78790c76a4030caa687e2fb9aa052883229675a39ceab9646f2: Status 404 returned error can't find the container with id 6fda7f26e951a78790c76a4030caa687e2fb9aa052883229675a39ceab9646f2 Apr 17 21:06:10.340757 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.340741 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:06:10.938222 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.938183 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" event={"ID":"2a07477d-9552-429a-b5e3-3eb2f236e667","Type":"ContainerStarted","Data":"1912899d9b6c6f0c0c0c8207abc27aff0c13e822c260e7c3878eb531f97dbbb8"} Apr 17 21:06:10.938222 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.938222 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" event={"ID":"2a07477d-9552-429a-b5e3-3eb2f236e667","Type":"ContainerStarted","Data":"6fda7f26e951a78790c76a4030caa687e2fb9aa052883229675a39ceab9646f2"} Apr 17 21:06:10.938438 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.938246 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:10.952623 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:10.952578 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" podStartSLOduration=1.952564019 podStartE2EDuration="1.952564019s" 
podCreationTimestamp="2026-04-17 21:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:06:10.950988598 +0000 UTC m=+1336.749866764" watchObservedRunningTime="2026-04-17 21:06:10.952564019 +0000 UTC m=+1336.751442184" Apr 17 21:06:11.553228 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:11.553196 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qf8m5_e29bb1be-edc2-47b7-8269-a7ceb57323f1/dns/0.log" Apr 17 21:06:11.571343 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:11.571317 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qf8m5_e29bb1be-edc2-47b7-8269-a7ceb57323f1/kube-rbac-proxy/0.log" Apr 17 21:06:11.636792 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:11.636761 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bcf4q_0ad25b90-ed3e-4976-b701-b30fbe6881cd/dns-node-resolver/0.log" Apr 17 21:06:12.114495 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:12.114469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fmmlt_71ae2eb0-2562-4952-a3e2-66786045ebd7/node-ca/0.log" Apr 17 21:06:12.941121 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:12.941085 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-vv4x6_90dfe00c-70eb-4a27-bcea-7f82655ba36c/discovery/0.log" Apr 17 21:06:12.979796 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:12.979773 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-674746b5f4-g87sv_df44c2b0-79cb-48a8-95fd-7c810d41b946/kube-auth-proxy/0.log" Apr 17 21:06:13.524685 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:13.524661 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jrhbz_d7a05ede-324e-4207-a7a9-c301663390b7/serve-healthcheck-canary/0.log" Apr 17 21:06:13.975783 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:13.975752 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nqw6n_71b8cadb-5b6a-4cfd-b79f-08bef397fb44/kube-rbac-proxy/0.log" Apr 17 21:06:13.998562 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:13.998536 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nqw6n_71b8cadb-5b6a-4cfd-b79f-08bef397fb44/exporter/0.log" Apr 17 21:06:14.018718 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:14.018701 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nqw6n_71b8cadb-5b6a-4cfd-b79f-08bef397fb44/extractor/0.log" Apr 17 21:06:15.890953 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:15.890926 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-wwhfg_418163da-535c-405a-bff0-684797bf68e4/manager/0.log" Apr 17 21:06:15.912202 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:15.912176 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-78f96b4b44-wf2lc_91c5e488-5246-4633-9931-60a92ee29cee/manager/0.log" Apr 17 21:06:15.933656 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:15.933632 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jc74h_37ba6f55-8094-4ab4-b24b-6330f49f9c07/manager/1.log" Apr 17 
21:06:15.937615 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:15.937597 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jc74h_37ba6f55-8094-4ab4-b24b-6330f49f9c07/manager/2.log" Apr 17 21:06:15.984502 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:15.984485 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6dc4849f89-ndxfz_e113327f-ec56-4733-9105-3369eb3947b4/manager/0.log" Apr 17 21:06:16.951507 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:16.951483 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-znlkx/perf-node-gather-daemonset-4s59n" Apr 17 21:06:17.119844 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:17.119817 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-m5xff_dccb5093-e348-4b95-9ba7-55b96602e17f/openshift-lws-operator/0.log" Apr 17 21:06:22.620125 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.620094 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2bn4l_95bd04a6-fb3b-498b-bf3e-7b047bad740d/kube-multus/0.log" Apr 17 21:06:22.833632 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.833593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/kube-multus-additional-cni-plugins/0.log" Apr 17 21:06:22.852140 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.852115 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/egress-router-binary-copy/0.log" Apr 17 21:06:22.870651 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.870590 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/cni-plugins/0.log" Apr 17 21:06:22.889400 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.889382 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/bond-cni-plugin/0.log" Apr 17 21:06:22.910773 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.910754 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/routeoverride-cni/0.log" Apr 17 21:06:22.929378 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.929361 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/whereabouts-cni-bincopy/0.log" Apr 17 21:06:22.947519 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:22.947502 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tz6kr_2aad12b0-2520-4cf5-bc30-a332be05db03/whereabouts-cni/0.log" Apr 17 21:06:23.222348 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:23.222270 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mxwcv_b84b134c-9465-48d2-b811-36203ae88de2/network-metrics-daemon/0.log" Apr 17 21:06:23.240763 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:23.240741 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-mxwcv_b84b134c-9465-48d2-b811-36203ae88de2/kube-rbac-proxy/0.log" Apr 17 21:06:24.329498 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.329466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/ovn-controller/0.log" Apr 17 21:06:24.351560 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.351535 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/ovn-acl-logging/0.log" Apr 17 21:06:24.370092 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.370072 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/kube-rbac-proxy-node/0.log" Apr 17 21:06:24.387826 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.387807 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 21:06:24.404547 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.404531 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/northd/0.log" Apr 17 21:06:24.423919 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.423906 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/nbdb/0.log" Apr 17 21:06:24.442135 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.442118 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/sbdb/0.log" Apr 17 21:06:24.528720 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:24.528699 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fdrzh_a193cfd6-995e-4072-a6e1-26f3f8ca3a85/ovnkube-controller/0.log" Apr 17 21:06:25.889076 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:25.889034 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mpmw8_549959be-8acc-4beb-914c-74b089e36128/network-check-target-container/0.log" Apr 17 21:06:26.840168 ip-10-0-139-255 kubenswrapper[2567]: I0417 21:06:26.840138 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-knzfb_47702967-5f03-40a5-b1ae-9f6930a86290/iptables-alerter/0.log"