Apr 17 14:23:34.758341 ip-10-0-130-190 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:23:34.758353 ip-10-0-130-190 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:23:34.758363 ip-10-0-130-190 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:23:34.758675 ip-10-0-130-190 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:23:45.006187 ip-10-0-130-190 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:23:45.006202 ip-10-0-130-190 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7f94008ec1bb4e0da33670f202f463e1 --
Apr 17 14:26:04.897705 ip-10-0-130-190 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:26:05.330938 ip-10-0-130-190 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:26:05.330938 ip-10-0-130-190 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:26:05.330938 ip-10-0-130-190 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:26:05.330938 ip-10-0-130-190 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:26:05.330938 ip-10-0-130-190 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:26:05.331779 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.330998 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:26:05.333993 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.333978 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:05.333993 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.333992 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.333996 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334000 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334003 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334006 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334009 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334012 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334015 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334019 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334022 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334025 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334027 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334030 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334032 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334034 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334037 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334039 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334042 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334045 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334047 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:05.334058 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334050 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334052 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334055 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334058 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334060 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334063 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334066 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334069 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334072 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334074 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334077 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334079 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334082 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334085 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334088 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334090 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334094 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334098 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334101 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:05.334532 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334104 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334106 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334109 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334111 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334114 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334116 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334118 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334121 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334124 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334127 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334129 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334132 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334134 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334136 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334139 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334142 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334145 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334148 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334150 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334153 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:05.334990 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334158 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334162 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334165 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334167 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334170 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334172 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334175 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334177 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334180 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334183 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334186 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334188 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334192 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334194 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334197 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334199 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334202 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334204 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334207 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334209 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:05.335504 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334212 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334215 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334217 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334220 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334222 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334225 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334611 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334616 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334619 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334621 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334624 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334626 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334629 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334632 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334634 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334636 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334639 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334641 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334644 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334646 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:05.336000 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334649 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334652 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334654 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334656 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334659 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334662 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334664 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334667 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334669 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334672 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334674 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334676 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334679 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334681 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334684 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334687 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334690 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334692 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334695 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:05.336853 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334699 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334702 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334704 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334707 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334710 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334712 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334715 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334718 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334721 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334723 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334726 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334728 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334731 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334733 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334737 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334739 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334742 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334744 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334747 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334749 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:05.337421 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334751 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334754 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334756 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334758 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334761 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334763 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334767 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334769 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334772 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334775 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334777 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334780 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334782 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334785 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334787 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334789 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334792 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334795 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334798 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:05.337978 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334800 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334802 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334805 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334807 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334809 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334812 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334815 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334818 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334821 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334824 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334827 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334829 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334831 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.334834 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335683 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335691 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335698 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335702 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335707 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335711 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335715 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:26:05.338426 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335720 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335723 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335726 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335729 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335732 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335735 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335738 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335741 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335744 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335747 2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335750 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335752 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335756 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335758 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335762 2572 flags.go:64] FLAG: --config-dir=""
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335764 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335768 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335772 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335775 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335778 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335784 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335787 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335790 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335793 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335796 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:26:05.338944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335798 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335802 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335806 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335809 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335812 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335815 2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335818 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335822 2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335825 2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335828 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335831 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335834 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335838 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335841 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335844 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335847 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335850 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:26:05.339535 ip-10-0-130-190
kubenswrapper[2572]: I0417 14:26:05.335853 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335856 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335858 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335861 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335864 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335867 2572 flags.go:64] FLAG: --feature-gates="" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335871 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335874 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 14:26:05.339535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335877 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335879 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335884 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335924 2572 flags.go:64] FLAG: --help="false" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335959 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-130-190.ec2.internal" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335965 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335970 2572 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335976 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335980 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335984 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335991 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335994 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.335997 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336005 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336008 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336012 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336027 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336032 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336036 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336042 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:26:05.340137 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:26:05.336564 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336588 2572 flags.go:64] FLAG: --lock-file="" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336596 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336602 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:26:05.340137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336609 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336644 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336649 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336655 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336683 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336688 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336695 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336700 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336705 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336713 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336718 2572 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336725 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336730 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336740 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336745 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336750 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336755 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336760 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336765 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336779 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336789 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336794 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336800 2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:26:05.340736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336804 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:26:05.341325 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:26:05.336813 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336818 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336823 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336828 2572 flags.go:64] FLAG: --port="10250" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336837 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336843 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ea76c940334befe4" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336849 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336855 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336860 2572 flags.go:64] FLAG: --register-node="true" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336865 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336870 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336877 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336880 2572 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336887 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336890 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 
14:26:05.336894 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336898 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336901 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336912 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336916 2572 flags.go:64] FLAG: --runonce="false" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336919 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336922 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336926 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336929 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336932 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:26:05.341325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336935 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336939 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336946 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336951 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336956 2572 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336961 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336966 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.336971 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337013 2572 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337059 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337187 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337191 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337195 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337202 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337205 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337211 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337214 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337218 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337221 2572 flags.go:64] FLAG: --v="2" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 
14:26:05.337226 2572 flags.go:64] FLAG: --version="false" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337230 2572 flags.go:64] FLAG: --vmodule="" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337235 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.337239 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337335 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:26:05.341968 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337339 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337342 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337346 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337349 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337352 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337355 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337359 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337361 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337364 2572 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337366 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337369 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337372 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337375 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337377 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337380 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337383 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337386 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337389 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337393 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337395 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:26:05.342538 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337398 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337400 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337403 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337405 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337408 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337410 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337413 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337415 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337418 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337421 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 
14:26:05.337425 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337441 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337444 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337447 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337450 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337453 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337456 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337459 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337461 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:26:05.343033 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337464 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337466 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337469 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337471 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 
14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337474 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337477 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337479 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337482 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337484 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337487 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337489 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337492 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337494 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337497 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337499 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337502 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 
14:26:05.337504 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337507 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337509 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337512 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:26:05.343499 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337514 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337518 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337522 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337526 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337529 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337532 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337534 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337537 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337540 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:26:05.343970 
ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337542 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337545 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337547 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337550 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337553 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337555 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337558 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337561 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337563 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337566 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337568 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:05.343970 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337571 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337573 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337576 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337578 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337581 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.337583 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.338186 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.344307 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.344321 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344365 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344370 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344374 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344377 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344381 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344383 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344386 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:05.344463 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344389 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344392 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344394 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344397 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344399 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344402 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344404 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344407 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344409 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344411 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344414 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344416 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344419 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344421 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344424 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344427 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344445 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344448 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344451 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344453 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:05.344895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344457 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344459 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344462 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344464 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344469 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344473 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344476 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344479 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344482 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344485 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344488 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344491 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344494 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344496 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344499 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344501 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344504 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344506 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344509 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:05.345385 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344511 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344514 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344516 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344519 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344521 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344523 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344526 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344528 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344531 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344534 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344536 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344539 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344541 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344544 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344547 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344550 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344552 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344555 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344558 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344560 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344563 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:05.345895 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344565 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344568 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344570 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344572 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344575 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344577 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344580 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344582 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344585 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344587 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344590 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344592 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344595 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344597 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344601 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344605 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344608 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344610 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:05.346384 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344613 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.344624 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344740 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344745 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344748 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344751 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344754 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344757 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344760 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344762 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344765 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344768 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344770 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344773 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344775 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:05.346863 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344778 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344781 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344783 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344786 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344788 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344791 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344793 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344796 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344799 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344801 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344804 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344806 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344809 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344811 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344814 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344816 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344819 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344821 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344824 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:05.347225 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344827 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344829 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344832 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344834 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344837 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344839 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344842 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344850 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344853 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344856 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344873 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344876 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344879 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344883 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344886 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344888 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344892 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344896 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344899 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:05.347742 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344902 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344905 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344909 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344911 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344914 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344917 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344920 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344922 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344924 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344927 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344929 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344933 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344937 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344940 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344942 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344945 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344947 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344950 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344952 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344955 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:05.348253 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344957 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344961 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344963 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344966 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344968 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344971 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344973 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344976 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344978 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344980 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344983 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344985 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344988 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344990 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:05.344992 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.344997 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:26:05.348758 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.345089 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:26:05.349150 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.347956 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:26:05.349150 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.348885 2572 server.go:1019] "Starting client certificate rotation"
Apr 17 14:26:05.349150 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.348982 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:26:05.349150 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.349018 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:26:05.375071 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.375052 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:26:05.378508 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.378488 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:26:05.397330 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.397312 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:26:05.402816 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.402801 2572 log.go:25] "Validated CRI v1 image API"
Apr 17 14:26:05.403763 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.403747 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:26:05.404241 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.404227 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:26:05.412786 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.412762 2572 fs.go:135] Filesystem UUIDs: map[3c7fa6be-d273-48e1-808b-7c5b4ce95634:/dev/nvme0n1p4 63c9dbc1-b025-4d84-bdd0-75570138c052:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 14:26:05.412865 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.412784 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:26:05.418406 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.418303 2572 manager.go:217] Machine: {Timestamp:2026-04-17 14:26:05.416284847 +0000 UTC m=+0.401192143 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3196248 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2359aff09f3dd4524f627289c81cb9 SystemUUID:ec2359af-f09f-3dd4-524f-627289c81cb9 BootID:7f94008e-c1bb-4e0d-a336-70f202f463e1 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:67:2b:7e:a9:df Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:67:2b:7e:a9:df Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:cc:d6:ba:e8:3b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:26:05.418406 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.418402 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:26:05.418538 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.418525 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:26:05.419546 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.419525 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:26:05.419681 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.419548 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-190.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPol
icy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:26:05.419721 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.419690 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:26:05.419721 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.419698 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:26:05.419721 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.419711 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:26:05.421191 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.421180 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:26:05.421657 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.421640 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qmn6x" Apr 17 14:26:05.422010 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.422001 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:26:05.422107 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.422099 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:26:05.424605 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.424596 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:26:05.424644 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.424609 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:26:05.424644 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.424629 2572 file.go:69] "Watching 
path" path="/etc/kubernetes/manifests"
Apr 17 14:26:05.424644 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.424638 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 17 14:26:05.424788 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.424651 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 14:26:05.425701 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.425688 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 14:26:05.425771 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.425706 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 14:26:05.427748 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.427731 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qmn6x"
Apr 17 14:26:05.428530 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.428510 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 14:26:05.430261 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.430248 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 14:26:05.431533 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431514 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 14:26:05.431533 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431532 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 14:26:05.431533 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431538 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431546 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431556 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431563 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431569 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431573 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431581 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431586 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431594 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 14:26:05.431670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.431604 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 14:26:05.432511 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.432497 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 14:26:05.432511 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.432507 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 14:26:05.436151 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.436136 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 14:26:05.436217 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.436169 2572 server.go:1295] "Started kubelet"
Apr 17 14:26:05.436347 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.436299 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 14:26:05.436381 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.436280 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 14:26:05.436419 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.436407 2572 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 14:26:05.437060 ip-10-0-130-190 systemd[1]: Started Kubernetes Kubelet.
Apr 17 14:26:05.437565 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.437520 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:05.437630 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.437580 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 14:26:05.439224 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.439207 2572 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 14:26:05.442026 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.442007 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-190.ec2.internal" not found
Apr 17 14:26:05.443643 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.443627 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:05.445127 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.445108 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 14:26:05.445747 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.445731 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 14:26:05.446473 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446456 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 14:26:05.446473 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446458 2572 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 14:26:05.446606 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446482 2572 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 14:26:05.446606 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446586 2572 factory.go:55] Registering systemd factory
Apr 17 14:26:05.446606 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446606 2572 factory.go:223] Registration of the systemd container factory successfully
Apr 17 14:26:05.446725 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446624 2572 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 14:26:05.446725 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446633 2572 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 14:26:05.446812 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.446724 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-190.ec2.internal\" not found"
Apr 17 14:26:05.446857 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446842 2572 factory.go:153] Registering CRI-O factory
Apr 17 14:26:05.446857 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446856 2572 factory.go:223] Registration of the crio container factory successfully
Apr 17 14:26:05.446947 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446909 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 14:26:05.446947 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446935 2572 factory.go:103] Registering Raw factory
Apr 17 14:26:05.447038 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.446949 2572 manager.go:1196] Started watching for new ooms in manager
Apr 17 14:26:05.447038 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.446967 2572 kubelet.go:1618] "Image
garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 14:26:05.447397 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.447384 2572 manager.go:319] Starting recovery of all containers
Apr 17 14:26:05.447822 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.447803 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:05.450886 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.450851 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-190.ec2.internal\" not found" node="ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.457030 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.457010 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-190.ec2.internal" not found
Apr 17 14:26:05.457120 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.457034 2572 manager.go:324] Recovery completed
Apr 17 14:26:05.458806 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.458782 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 14:26:05.461521 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.461510 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:26:05.464085 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.464072 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:26:05.464131 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.464099 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:26:05.464131 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.464111 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:26:05.464599 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.464575 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 14:26:05.464599 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.464590 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 14:26:05.464730 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.464607 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 14:26:05.466481 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.466471 2572 policy_none.go:49] "None policy: Start"
Apr 17 14:26:05.466521 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.466486 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 14:26:05.466521 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.466495 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 14:26:05.500045 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500030 2572 manager.go:341] "Starting Device Plugin manager"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.500066 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500078 2572 server.go:85] "Starting device plugin registration server"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500318 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500331 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500416 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500520 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.500528 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.501013 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 14:26:05.511520 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.501047 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-190.ec2.internal\" not found"
Apr 17 14:26:05.519532 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.519514 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-190.ec2.internal" not found
Apr 17 14:26:05.573972 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.573942 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 14:26:05.575079 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.575064 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 14:26:05.575186 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.575087 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 14:26:05.575186 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.575110 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 14:26:05.575186 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.575118 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 14:26:05.575322 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.575202 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 14:26:05.577076 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.577054 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:05.600824 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.600772 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:26:05.601544 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.601529 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:26:05.601611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.601559 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:26:05.601611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.601570 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:26:05.601611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.601590 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.610550 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.610532 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.610603 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:05.610555 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-190.ec2.internal\": node \"ip-10-0-130-190.ec2.internal\" not found"
Apr 17
14:26:05.675326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.675270 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"]
Apr 17 14:26:05.677562 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.677547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.677562 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.677558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.703108 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.703087 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.707491 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.707477 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.715233 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.715220 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:26:05.720384 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.720365 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:26:05.848307 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.848276 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27f7804c58390a6a20607364ebeb663b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal\" (UID: \"27f7804c58390a6a20607364ebeb663b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.848307 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.848306 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27f7804c58390a6a20607364ebeb663b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal\" (UID: \"27f7804c58390a6a20607364ebeb663b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.848529 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.848323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fd5f4115a37b7b2be21e64925c15d47d-config\") pod \"kube-apiserver-proxy-ip-10-0-130-190.ec2.internal\" (UID: \"fd5f4115a37b7b2be21e64925c15d47d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.949389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.949306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27f7804c58390a6a20607364ebeb663b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal\" (UID: \"27f7804c58390a6a20607364ebeb663b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.949389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.949341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27f7804c58390a6a20607364ebeb663b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal\" (UID: \"27f7804c58390a6a20607364ebeb663b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.949389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.949358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fd5f4115a37b7b2be21e64925c15d47d-config\") pod \"kube-apiserver-proxy-ip-10-0-130-190.ec2.internal\" (UID: \"fd5f4115a37b7b2be21e64925c15d47d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.949604 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.949416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fd5f4115a37b7b2be21e64925c15d47d-config\") pod \"kube-apiserver-proxy-ip-10-0-130-190.ec2.internal\" (UID: \"fd5f4115a37b7b2be21e64925c15d47d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.949604 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.949422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27f7804c58390a6a20607364ebeb663b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal\" (UID: \"27f7804c58390a6a20607364ebeb663b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:05.949604 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:05.949415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27f7804c58390a6a20607364ebeb663b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal\" (UID: \"27f7804c58390a6a20607364ebeb663b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:06.017523 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.017490 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:06.022076 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.022054 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal"
Apr 17 14:26:06.348855 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.348776 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 14:26:06.349611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.348942 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:26:06.349611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.348945 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:26:06.349611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.348965 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:26:06.425227 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.425055 2572 apiserver.go:52] "Watching apiserver"
Apr 17 14:26:06.430400 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.430365 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:21:05 +0000
UTC" deadline="2027-10-24 10:27:01.484856086 +0000 UTC"
Apr 17 14:26:06.430400 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.430393 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13316h0m55.054465424s"
Apr 17 14:26:06.430608 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.430553 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 14:26:06.432291 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.432264 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2","openshift-cluster-node-tuning-operator/tuned-wmdfs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal","openshift-multus/network-metrics-daemon-b4mhh","openshift-network-operator/iptables-alerter-lcdd5","kube-system/konnectivity-agent-d6hck","openshift-dns/node-resolver-l7r6t","openshift-image-registry/node-ca-k494v","openshift-multus/multus-additional-cni-plugins-65fjv","openshift-multus/multus-brlgp","openshift-network-diagnostics/network-check-target-d4f88","openshift-ovn-kubernetes/ovnkube-node-nck2f","kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal"]
Apr 17 14:26:06.433573 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.433551 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.434710 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.434683 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.435957 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.435713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.435957 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.435753 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.435957 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.435816 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:06.435957 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.435828 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 14:26:06.435957 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.435908 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:06.435957 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.435922 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gkxh9\""
Apr 17 14:26:06.436880 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.436801 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.436880 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.436823 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8qdf9\""
Apr 17 14:26:06.436880 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.436837 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.436880 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.436876 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.437863 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.437847 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:06.438913 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.438645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xfn8s\""
Apr 17 14:26:06.438913 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.438888 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.438913 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.438913 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.439111 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.439096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 14:26:06.439611 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.439593 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 14:26:06.439710 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.439681 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 14:26:06.439933 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.439917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.440014 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.439997 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.440014 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.440008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dw7qc\""
Apr 17 14:26:06.441045 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.441028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.441577 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.441560 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.441816 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.441802 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s8hdz\""
Apr 17 14:26:06.441906 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.441806 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.441906 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.441880 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.442002 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.441950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bj4fz\""
Apr 17 14:26:06.442473 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.442326 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 14:26:06.442574 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.442475 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.443203 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.442730 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.443203 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.443012 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 14:26:06.444250 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.444228 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.444396 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.444380 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 14:26:06.444913 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.444895 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nzr4f\"" Apr 17 14:26:06.444948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.444925 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.446085 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.445543 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 14:26:06.446085 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.445594 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 14:26:06.446085 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.445609 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:06.446085 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.445685 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:06.447043 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.447000 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 14:26:06.447911 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.447892 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.448176 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.448159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sfssv\"" Apr 17 14:26:06.449698 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.449681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.449826 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.449792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:26:06.449898 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.449813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l6hmw\"" Apr 17 14:26:06.450209 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.449998 2572 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:26:06.450209 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.450056 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:26:06.450209 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.450199 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.451268 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.451393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-tuned\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.451393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-k8s-cni-cncf-io\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.451393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/70b0c40d-084b-491c-8390-f199b025b91b-multus-daemon-config\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.451393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-systemd-units\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwsx\" (UniqueName: \"kubernetes.io/projected/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-kube-api-access-mcwsx\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-sys-fs\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-run\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451481 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-var-lib-kubelet\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-node-log\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovnkube-config\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.451554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451535 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovn-node-metrics-cert\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovnkube-script-lib\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2fk\" (UniqueName: \"kubernetes.io/projected/fc19c040-93e2-4007-93c1-ee24954d0d5a-kube-api-access-qk2fk\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3261e7-2587-464e-ac8f-c31c1d9d88e8-host\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70b0c40d-084b-491c-8390-f199b025b91b-cni-binary-copy\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-kubelet\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-etc-kubernetes\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.451767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdv5w\" (UniqueName: \"kubernetes.io/projected/70b0c40d-084b-491c-8390-f199b025b91b-kube-api-access-mdv5w\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cni-binary-copy\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-ovn\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-kubernetes\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451848 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-socket-dir-parent\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-cni-bin\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/29d21c65-516d-41f7-8313-d3fd5a97d74a-agent-certs\") pod \"konnectivity-agent-d6hck\" (UID: \"29d21c65-516d-41f7-8313-d3fd5a97d74a\") " pod="kube-system/konnectivity-agent-d6hck" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-system-cni-dir\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nck2f\" (UID: 
\"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.451997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12f3294c-ef76-4702-8d03-5991c66cadb2-host-slash\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452064 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-host\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.452064 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452064 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-conf-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 
14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-kubelet\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452142 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-systemd\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452166 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-var-lib-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-run-ovn-kubernetes\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysctl-d\") pod 
\"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-socket-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-cni-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-os-release\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-run-netns\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-cni-netd\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flb87\" (UniqueName: \"kubernetes.io/projected/12f3294c-ef76-4702-8d03-5991c66cadb2-kube-api-access-flb87\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452409 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-etc-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjzr\" (UniqueName: \"kubernetes.io/projected/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-kube-api-access-zcjzr\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.452551 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-device-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.452551 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:26:06.452503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-sys\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpm6q\" (UniqueName: \"kubernetes.io/projected/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-kube-api-access-qpm6q\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-modprobe-d\") pod \"tuned-wmdfs\" (UID: 
\"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysctl-conf\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-lib-modules\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/29d21c65-516d-41f7-8313-d3fd5a97d74a-konnectivity-ca\") pod \"konnectivity-agent-d6hck\" (UID: \"29d21c65-516d-41f7-8313-d3fd5a97d74a\") " pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-os-release\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-log-socket\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-cnibin\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-netns\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-hosts-file\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-tmp-dir\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.453254 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de3261e7-2587-464e-ac8f-c31c1d9d88e8-serviceca\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.452983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-hostroot\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453008 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cnibin\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453022 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-registration-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-cni-bin\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-env-overrides\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfg4n\" (UniqueName: \"kubernetes.io/projected/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-kube-api-access-tfg4n\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/12f3294c-ef76-4702-8d03-5991c66cadb2-iptables-alerter-script\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-cni-multus\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6tm\" (UniqueName: \"kubernetes.io/projected/1e20b346-d933-444b-947f-2bb4b05a5b07-kube-api-access-5s6tm\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysconfig\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-tmp\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-systemd\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgh9b\" (UniqueName: \"kubernetes.io/projected/de3261e7-2587-464e-ac8f-c31c1d9d88e8-kube-api-access-cgh9b\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453362 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-system-cni-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.453800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-multus-certs\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.454228 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.454228 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.453465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-slash\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.454889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.454874 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:26:06.476764 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.476743 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sjnp9"
Apr 17 14:26:06.486554 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.486533 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sjnp9"
Apr 17 14:26:06.508217 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.508175 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27f7804c58390a6a20607364ebeb663b.slice/crio-51e557003d5836c6ca2f0a39abe3e830b74fdb2807c7ec44af1b90640e710120 WatchSource:0}: Error finding container 51e557003d5836c6ca2f0a39abe3e830b74fdb2807c7ec44af1b90640e710120: Status 404 returned error can't find the container with id 51e557003d5836c6ca2f0a39abe3e830b74fdb2807c7ec44af1b90640e710120
Apr 17 14:26:06.508509 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.508487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5f4115a37b7b2be21e64925c15d47d.slice/crio-90d0c9e5ddab9bd25d09aee415270a344d602800d7a832bf6d7970dda8a5701f WatchSource:0}: Error finding container 90d0c9e5ddab9bd25d09aee415270a344d602800d7a832bf6d7970dda8a5701f: Status 404 returned error can't find the container with id 90d0c9e5ddab9bd25d09aee415270a344d602800d7a832bf6d7970dda8a5701f
Apr 17 14:26:06.513335 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.513317 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:26:06.547670 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.547647 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 14:26:06.553727 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.553811 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-tuned\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.553811 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-k8s-cni-cncf-io\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.553811 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70b0c40d-084b-491c-8390-f199b025b91b-multus-daemon-config\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.553811 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-systemd-units\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwsx\" (UniqueName: \"kubernetes.io/projected/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-kube-api-access-mcwsx\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-k8s-cni-cncf-io\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-systemd-units\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-sys-fs\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-run\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.554005 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.553984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-var-lib-kubelet\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-node-log\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-sys-fs\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovnkube-config\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-run\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovn-node-metrics-cert\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-node-log\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-var-lib-kubelet\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovnkube-script-lib\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk2fk\" (UniqueName: \"kubernetes.io/projected/fc19c040-93e2-4007-93c1-ee24954d0d5a-kube-api-access-qk2fk\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3261e7-2587-464e-ac8f-c31c1d9d88e8-host\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70b0c40d-084b-491c-8390-f199b025b91b-cni-binary-copy\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-kubelet\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de3261e7-2587-464e-ac8f-c31c1d9d88e8-host\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-etc-kubernetes\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdv5w\" (UniqueName: \"kubernetes.io/projected/70b0c40d-084b-491c-8390-f199b025b91b-kube-api-access-mdv5w\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.554318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cni-binary-copy\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-kubelet\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70b0c40d-084b-491c-8390-f199b025b91b-multus-daemon-config\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554265 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-etc-kubernetes\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-ovn\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-kubernetes\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-socket-dir-parent\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovnkube-config\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-cni-bin\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-cni-bin\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/29d21c65-516d-41f7-8313-d3fd5a97d74a-agent-certs\") pod \"konnectivity-agent-d6hck\" (UID: \"29d21c65-516d-41f7-8313-d3fd5a97d74a\") " pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-system-cni-dir\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-ovn\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554777 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70b0c40d-084b-491c-8390-f199b025b91b-cni-binary-copy\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-kubernetes\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12f3294c-ef76-4702-8d03-5991c66cadb2-host-slash\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.555029 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-system-cni-dir\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/12f3294c-ef76-4702-8d03-5991c66cadb2-host-slash\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cni-binary-copy\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-socket-dir-parent\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-etc-selinux\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-host\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-conf-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovnkube-script-lib\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-kubelet\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-conf-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-systemd\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.554963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-host\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-systemd\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-kubelet\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-var-lib-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.555889 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-var-lib-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555079 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-run-ovn-kubernetes\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysctl-d\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-socket-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-run-ovn-kubernetes\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-cni-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-os-release\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-run-netns\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555271 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-socket-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-cni-netd\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555280 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-run-netns\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flb87\" (UniqueName: \"kubernetes.io/projected/12f3294c-ef76-4702-8d03-5991c66cadb2-kube-api-access-flb87\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-os-release\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555329 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-etc-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-multus-cni-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysctl-d\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-etc-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.556678 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-cni-netd\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zcjzr\" (UniqueName: \"kubernetes.io/projected/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-kube-api-access-zcjzr\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-device-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-sys\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555514 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpm6q\" (UniqueName: \"kubernetes.io/projected/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-kube-api-access-qpm6q\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555564 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-sys\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-modprobe-d\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-device-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysctl-conf\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555632 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-run-openvswitch\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-lib-modules\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.555672 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/29d21c65-516d-41f7-8313-d3fd5a97d74a-konnectivity-ca\") pod \"konnectivity-agent-d6hck\" (UID: \"29d21c65-516d-41f7-8313-d3fd5a97d74a\") " pod="kube-system/konnectivity-agent-d6hck" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-os-release\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-modprobe-d\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.557405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.555787 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:07.055727648 +0000 UTC m=+2.040634932 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-lib-modules\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysctl-conf\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-log-socket\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-cnibin\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-netns\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-os-release\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.555963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-log-socket\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-netns\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-cnibin\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: 
\"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-hosts-file\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-tmp-dir\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de3261e7-2587-464e-ac8f-c31c1d9d88e8-serviceca\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-hostroot\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.558027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cnibin\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-registration-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-cni-bin\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-cnibin\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556259 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-env-overrides\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/29d21c65-516d-41f7-8313-d3fd5a97d74a-konnectivity-ca\") pod \"konnectivity-agent-d6hck\" (UID: \"29d21c65-516d-41f7-8313-d3fd5a97d74a\") " pod="kube-system/konnectivity-agent-d6hck" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fc19c040-93e2-4007-93c1-ee24954d0d5a-registration-dir\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfg4n\" (UniqueName: \"kubernetes.io/projected/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-kube-api-access-tfg4n\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/12f3294c-ef76-4702-8d03-5991c66cadb2-iptables-alerter-script\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5" Apr 17 14:26:06.558499 
ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-cni-multus\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-cni-bin\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556226 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-hosts-file\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6tm\" (UniqueName: \"kubernetes.io/projected/1e20b346-d933-444b-947f-2bb4b05a5b07-kube-api-access-5s6tm\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysconfig\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.558499 
ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-tmp\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-systemd\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" Apr 17 14:26:06.558499 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgh9b\" (UniqueName: \"kubernetes.io/projected/de3261e7-2587-464e-ac8f-c31c1d9d88e8-kube-api-access-cgh9b\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v" Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-system-cni-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp" Apr 17 14:26:06.558948 
ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-tmp-dir\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-multus-certs\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-slash\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556716 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-env-overrides\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-host-slash\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-var-lib-cni-multus\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-systemd\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-sysconfig\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.556326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-hostroot\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.557055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-system-cni-dir\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.557134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70b0c40d-084b-491c-8390-f199b025b91b-host-run-multus-certs\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.557144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de3261e7-2587-464e-ac8f-c31c1d9d88e8-serviceca\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.557459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/12f3294c-ef76-4702-8d03-5991c66cadb2-iptables-alerter-script\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.557535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.558948 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.557970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-ovn-node-metrics-cert\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.559405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.558029 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-etc-tuned\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.559405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.558111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/29d21c65-516d-41f7-8313-d3fd5a97d74a-agent-certs\") pod \"konnectivity-agent-d6hck\" (UID: \"29d21c65-516d-41f7-8313-d3fd5a97d74a\") " pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:06.559405 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.558729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-tmp\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.561413 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.561390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcwsx\" (UniqueName: \"kubernetes.io/projected/a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d-kube-api-access-mcwsx\") pod \"tuned-wmdfs\" (UID: \"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d\") " pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.561573 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.561560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdv5w\" (UniqueName: \"kubernetes.io/projected/70b0c40d-084b-491c-8390-f199b025b91b-kube-api-access-mdv5w\") pod \"multus-brlgp\" (UID: \"70b0c40d-084b-491c-8390-f199b025b91b\") " pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.561732 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.561715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2fk\" (UniqueName: \"kubernetes.io/projected/fc19c040-93e2-4007-93c1-ee24954d0d5a-kube-api-access-qk2fk\") pod \"aws-ebs-csi-driver-node-8zfj2\" (UID: \"fc19c040-93e2-4007-93c1-ee24954d0d5a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.565322 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.565307 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:26:06.565370 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.565324 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:26:06.565370 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.565334 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h6rgx for pod openshift-network-diagnostics/network-check-target-d4f88: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:06.565452 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:06.565378 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx podName:479a0d66-ba09-406a-9da8-b98589e81608 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:07.065361875 +0000 UTC m=+2.050269158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-h6rgx" (UniqueName: "kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx") pod "network-check-target-d4f88" (UID: "479a0d66-ba09-406a-9da8-b98589e81608") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:06.567295 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.567275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flb87\" (UniqueName: \"kubernetes.io/projected/12f3294c-ef76-4702-8d03-5991c66cadb2-kube-api-access-flb87\") pod \"iptables-alerter-lcdd5\" (UID: \"12f3294c-ef76-4702-8d03-5991c66cadb2\") " pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.567729 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.567713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjzr\" (UniqueName: \"kubernetes.io/projected/c607d8f0-4652-40d6-a3b8-74f2c8fcc998-kube-api-access-zcjzr\") pod \"ovnkube-node-nck2f\" (UID: \"c607d8f0-4652-40d6-a3b8-74f2c8fcc998\") " pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.567729 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.567723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpm6q\" (UniqueName: \"kubernetes.io/projected/f1097dd8-2309-4ac7-ae1d-b1ca093e2063-kube-api-access-qpm6q\") pod \"multus-additional-cni-plugins-65fjv\" (UID: \"f1097dd8-2309-4ac7-ae1d-b1ca093e2063\") " pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.568106 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.568089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6tm\" (UniqueName: \"kubernetes.io/projected/1e20b346-d933-444b-947f-2bb4b05a5b07-kube-api-access-5s6tm\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:06.568756 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.568737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgh9b\" (UniqueName: \"kubernetes.io/projected/de3261e7-2587-464e-ac8f-c31c1d9d88e8-kube-api-access-cgh9b\") pod \"node-ca-k494v\" (UID: \"de3261e7-2587-464e-ac8f-c31c1d9d88e8\") " pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.568857 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.568843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfg4n\" (UniqueName: \"kubernetes.io/projected/e7e5fd15-2e9b-40a4-90da-1410a8f629bd-kube-api-access-tfg4n\") pod \"node-resolver-l7r6t\" (UID: \"e7e5fd15-2e9b-40a4-90da-1410a8f629bd\") " pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.578473 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.578365 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal" event={"ID":"27f7804c58390a6a20607364ebeb663b","Type":"ContainerStarted","Data":"51e557003d5836c6ca2f0a39abe3e830b74fdb2807c7ec44af1b90640e710120"}
Apr 17 14:26:06.579471 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.579452 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal" event={"ID":"fd5f4115a37b7b2be21e64925c15d47d","Type":"ContainerStarted","Data":"90d0c9e5ddab9bd25d09aee415270a344d602800d7a832bf6d7970dda8a5701f"}
Apr 17 14:26:06.767347 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.767228 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2"
Apr 17 14:26:06.773509 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.773484 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc19c040_93e2_4007_93c1_ee24954d0d5a.slice/crio-4458f04c2bad294ca8484532c205153e932c339ce910aef51969951888430ff4 WatchSource:0}: Error finding container 4458f04c2bad294ca8484532c205153e932c339ce910aef51969951888430ff4: Status 404 returned error can't find the container with id 4458f04c2bad294ca8484532c205153e932c339ce910aef51969951888430ff4
Apr 17 14:26:06.779403 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.779382 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wmdfs"
Apr 17 14:26:06.785581 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.785554 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d3e998_5f09_49e6_aecd_b23ad2e3ba0d.slice/crio-0f665e86d887165acba48185491c8fafad1a1023e98bd86410881076dfd4ba54 WatchSource:0}: Error finding container 0f665e86d887165acba48185491c8fafad1a1023e98bd86410881076dfd4ba54: Status 404 returned error can't find the container with id 0f665e86d887165acba48185491c8fafad1a1023e98bd86410881076dfd4ba54
Apr 17 14:26:06.795930 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.795906 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lcdd5"
Apr 17 14:26:06.801475 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.801448 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f3294c_ef76_4702_8d03_5991c66cadb2.slice/crio-dc3d43c9194d842e1ef4596b14bcf2b16b7ba5a4aefcea75549987a1e788ea7b WatchSource:0}: Error finding container dc3d43c9194d842e1ef4596b14bcf2b16b7ba5a4aefcea75549987a1e788ea7b: Status 404 returned error can't find the container with id dc3d43c9194d842e1ef4596b14bcf2b16b7ba5a4aefcea75549987a1e788ea7b
Apr 17 14:26:06.806527 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.806509 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:06.811179 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.811154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l7r6t"
Apr 17 14:26:06.813267 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.813239 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d21c65_516d_41f7_8313_d3fd5a97d74a.slice/crio-605ccf29aee70c1def8a8ca16e76160086f903e643838100e6b100102bedd90a WatchSource:0}: Error finding container 605ccf29aee70c1def8a8ca16e76160086f903e643838100e6b100102bedd90a: Status 404 returned error can't find the container with id 605ccf29aee70c1def8a8ca16e76160086f903e643838100e6b100102bedd90a
Apr 17 14:26:06.817554 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.817534 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e5fd15_2e9b_40a4_90da_1410a8f629bd.slice/crio-9021dde14a995dc789721afa90683982838cc869aa13dfbf06c147ebbc75e92e WatchSource:0}: Error finding container 9021dde14a995dc789721afa90683982838cc869aa13dfbf06c147ebbc75e92e: Status 404 returned error can't find the container with id 9021dde14a995dc789721afa90683982838cc869aa13dfbf06c147ebbc75e92e
Apr 17 14:26:06.818795 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.818728 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-k494v"
Apr 17 14:26:06.824109 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.824089 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3261e7_2587_464e_ac8f_c31c1d9d88e8.slice/crio-5554a152db90ee9c4faefb3844157a0ceda6fad75e248e28c51a75ec6ad0d6d5 WatchSource:0}: Error finding container 5554a152db90ee9c4faefb3844157a0ceda6fad75e248e28c51a75ec6ad0d6d5: Status 404 returned error can't find the container with id 5554a152db90ee9c4faefb3844157a0ceda6fad75e248e28c51a75ec6ad0d6d5
Apr 17 14:26:06.824705 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.824686 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-65fjv"
Apr 17 14:26:06.830600 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.830577 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-brlgp"
Apr 17 14:26:06.830677 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.830608 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1097dd8_2309_4ac7_ae1d_b1ca093e2063.slice/crio-52e2bffa31b4679c973416bf734ec56d1f57ec4401929df958b63eb6dc4d01ac WatchSource:0}: Error finding container 52e2bffa31b4679c973416bf734ec56d1f57ec4401929df958b63eb6dc4d01ac: Status 404 returned error can't find the container with id 52e2bffa31b4679c973416bf734ec56d1f57ec4401929df958b63eb6dc4d01ac
Apr 17 14:26:06.835820 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:06.835802 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:06.837874 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.837843 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70b0c40d_084b_491c_8390_f199b025b91b.slice/crio-53d13bf317e2e98491fe059cb65cb3b6c9f6642bde20ee1198f345348f0de8f1 WatchSource:0}: Error finding container 53d13bf317e2e98491fe059cb65cb3b6c9f6642bde20ee1198f345348f0de8f1: Status 404 returned error can't find the container with id 53d13bf317e2e98491fe059cb65cb3b6c9f6642bde20ee1198f345348f0de8f1
Apr 17 14:26:06.842657 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:06.842638 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc607d8f0_4652_40d6_a3b8_74f2c8fcc998.slice/crio-8033e0cd31722a411371889588810749486c4bedcf2becaae8b90f50e91e5891 WatchSource:0}: Error finding container 8033e0cd31722a411371889588810749486c4bedcf2becaae8b90f50e91e5891: Status 404 returned error can't find the container with id 8033e0cd31722a411371889588810749486c4bedcf2becaae8b90f50e91e5891
Apr 17 14:26:07.060597 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.060516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:07.060788 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.060653 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:26:07.060788 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.060742 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:08.060722558 +0000 UTC m=+3.045629861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:26:07.162207 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.161641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:07.162207 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.161780 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:26:07.162207 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.161797 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:26:07.162207 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.161809 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h6rgx for pod openshift-network-diagnostics/network-check-target-d4f88: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:07.162207 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.161877 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx podName:479a0d66-ba09-406a-9da8-b98589e81608 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:08.16184563 +0000 UTC m=+3.146752926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6rgx" (UniqueName: "kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx") pod "network-check-target-d4f88" (UID: "479a0d66-ba09-406a-9da8-b98589e81608") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:07.414454 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.414363 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:07.487517 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.487478 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:21:06 +0000 UTC" deadline="2027-10-29 09:58:47.747382437 +0000 UTC"
Apr 17 14:26:07.487517 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.487513 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13435h32m40.259872615s"
Apr 17 14:26:07.543813 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.543781 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:07.576148 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.576119 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:07.576303 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:07.576242 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:07.588188 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.588156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"8033e0cd31722a411371889588810749486c4bedcf2becaae8b90f50e91e5891"}
Apr 17 14:26:07.590746 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.590703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k494v" event={"ID":"de3261e7-2587-464e-ac8f-c31c1d9d88e8","Type":"ContainerStarted","Data":"5554a152db90ee9c4faefb3844157a0ceda6fad75e248e28c51a75ec6ad0d6d5"}
Apr 17 14:26:07.593008 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.592984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l7r6t" event={"ID":"e7e5fd15-2e9b-40a4-90da-1410a8f629bd","Type":"ContainerStarted","Data":"9021dde14a995dc789721afa90683982838cc869aa13dfbf06c147ebbc75e92e"}
Apr 17 14:26:07.600220 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.600163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d6hck" event={"ID":"29d21c65-516d-41f7-8313-d3fd5a97d74a","Type":"ContainerStarted","Data":"605ccf29aee70c1def8a8ca16e76160086f903e643838100e6b100102bedd90a"}
Apr 17 14:26:07.602531 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.602472 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lcdd5" event={"ID":"12f3294c-ef76-4702-8d03-5991c66cadb2","Type":"ContainerStarted","Data":"dc3d43c9194d842e1ef4596b14bcf2b16b7ba5a4aefcea75549987a1e788ea7b"}
Apr 17 14:26:07.616392 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.616341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" event={"ID":"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d","Type":"ContainerStarted","Data":"0f665e86d887165acba48185491c8fafad1a1023e98bd86410881076dfd4ba54"}
Apr 17 14:26:07.620099 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.620039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" event={"ID":"fc19c040-93e2-4007-93c1-ee24954d0d5a","Type":"ContainerStarted","Data":"4458f04c2bad294ca8484532c205153e932c339ce910aef51969951888430ff4"}
Apr 17 14:26:07.624230 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.624195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-brlgp" event={"ID":"70b0c40d-084b-491c-8390-f199b025b91b","Type":"ContainerStarted","Data":"53d13bf317e2e98491fe059cb65cb3b6c9f6642bde20ee1198f345348f0de8f1"}
Apr 17 14:26:07.636801 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.636771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerStarted","Data":"52e2bffa31b4679c973416bf734ec56d1f57ec4401929df958b63eb6dc4d01ac"}
Apr 17 14:26:07.942736 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:07.942703 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:08.068688 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:08.068633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:08.068851 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.068830 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:26:08.068922 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.068898 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:10.068877802 +0000 UTC m=+5.053785088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:26:08.170665 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:08.169939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:08.170665 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.170149 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:26:08.170665 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.170168 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:26:08.170665 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.170180 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h6rgx for pod openshift-network-diagnostics/network-check-target-d4f88: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:08.170665 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.170263 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx podName:479a0d66-ba09-406a-9da8-b98589e81608 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:10.170244036 +0000 UTC m=+5.155151326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6rgx" (UniqueName: "kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx") pod "network-check-target-d4f88" (UID: "479a0d66-ba09-406a-9da8-b98589e81608") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:08.488229 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:08.488185 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:21:06 +0000 UTC" deadline="2027-12-20 11:47:30.637185393 +0000 UTC"
Apr 17 14:26:08.488229 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:08.488227 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14685h21m22.148963136s"
Apr 17 14:26:08.576033 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:08.576002 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:08.576204 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:08.576140 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:09.576209 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:09.576174 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:09.576700 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:09.576299 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:10.088399 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:10.088357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:10.088599 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.088529 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:26:10.088599 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.088592 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:14.088572583 +0000 UTC m=+9.073479869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:26:10.189908 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:10.189146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:10.189908 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.189381 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:26:10.189908 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.189452 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:26:10.189908 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.189467 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h6rgx for pod openshift-network-diagnostics/network-check-target-d4f88: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:10.189908 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.189527 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx podName:479a0d66-ba09-406a-9da8-b98589e81608 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:14.189506518 +0000 UTC m=+9.174413825 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6rgx" (UniqueName: "kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx") pod "network-check-target-d4f88" (UID: "479a0d66-ba09-406a-9da8-b98589e81608") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:26:10.576226 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:10.576104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:10.576715 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:10.576258 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:11.575897 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:11.575858 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:11.576086 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:11.575994 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:12.576137 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:12.576104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:12.576534 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:12.576257 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:13.576126 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:13.576055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:13.576422 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:13.576380 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:14.122146 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:14.121535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:14.122146 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.121709 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:14.122146 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.121777 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:22.121755532 +0000 UTC m=+17.106662822 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:14.222716 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:14.222676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:14.222910 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.222855 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:14.222910 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.222879 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:14.222910 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.222891 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h6rgx for pod openshift-network-diagnostics/network-check-target-d4f88: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:14.223066 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.222951 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx podName:479a0d66-ba09-406a-9da8-b98589e81608 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:26:22.222932138 +0000 UTC m=+17.207839431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6rgx" (UniqueName: "kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx") pod "network-check-target-d4f88" (UID: "479a0d66-ba09-406a-9da8-b98589e81608") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:14.576027 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:14.575986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:14.576223 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:14.576124 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:15.576248 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:15.576172 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:15.576724 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:15.576276 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:16.575423 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:16.575340 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:16.575603 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:16.575477 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:17.576031 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:17.575984 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:17.576499 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:17.576127 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:18.575886 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:18.575845 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:18.576062 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:18.575969 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:19.575712 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:19.575679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:19.575876 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:19.575782 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:20.575811 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:20.575779 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:20.576201 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:20.575905 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:21.575921 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:21.575889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:21.576488 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:21.576005 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:22.176603 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:22.176565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:22.176832 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.176718 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:22.176832 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.176778 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.176762514 +0000 UTC m=+33.161669803 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:22.277182 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:22.277145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:22.277378 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.277325 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:22.277378 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.277348 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:22.277378 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.277361 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h6rgx for pod openshift-network-diagnostics/network-check-target-d4f88: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:22.277558 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.277424 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx podName:479a0d66-ba09-406a-9da8-b98589e81608 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:26:38.277407973 +0000 UTC m=+33.262315278 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h6rgx" (UniqueName: "kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx") pod "network-check-target-d4f88" (UID: "479a0d66-ba09-406a-9da8-b98589e81608") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:22.575453 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:22.575359 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:22.575589 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:22.575524 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:23.575561 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:23.575521 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:23.576006 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:23.575661 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:24.575959 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.575579 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:24.575959 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:24.575919 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:26:24.683858 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.683690 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log" Apr 17 14:26:24.684375 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.684347 2572 generic.go:358] "Generic (PLEG): container finished" podID="c607d8f0-4652-40d6-a3b8-74f2c8fcc998" containerID="3fc58b82511d71bdf7abd51e42802a9db5bc8c98c137f130be4ea1eb82bf70de" exitCode=1 Apr 17 14:26:24.684475 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.684420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"82f83d55d7d5180588d3dfeabc630b86c5af9876783c189fe0fbd1638c34fbf7"} Apr 17 14:26:24.684475 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.684467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" 
event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"830b89d9930c88990e8da71a455bad0fa3fdc157762e292cf0473baa700d6a73"} Apr 17 14:26:24.684560 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.684484 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerDied","Data":"3fc58b82511d71bdf7abd51e42802a9db5bc8c98c137f130be4ea1eb82bf70de"} Apr 17 14:26:24.684560 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.684500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"2283bf84f05b10cfe42df118a26758498e846d1856f8015e4fd5bb5dbcb55191"} Apr 17 14:26:24.688010 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.687984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" event={"ID":"a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d","Type":"ContainerStarted","Data":"713901cefb7defa1b973952e229971fd044b06c55c21f9a5fba7ca3d136717d6"} Apr 17 14:26:24.689535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.689503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-brlgp" event={"ID":"70b0c40d-084b-491c-8390-f199b025b91b","Type":"ContainerStarted","Data":"cbe8b03eee6351a4e66bc4c5850fb8c74b8ef40de11343bc17dbed8454b151ec"} Apr 17 14:26:24.691284 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.691216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal" event={"ID":"fd5f4115a37b7b2be21e64925c15d47d","Type":"ContainerStarted","Data":"502d3192b9268340907d24d4ccb5197121dc729886fec4e8038766769c8eff8b"} Apr 17 14:26:24.717028 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.715720 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-node-tuning-operator/tuned-wmdfs" podStartSLOduration=2.317569827 podStartE2EDuration="19.715701775s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.787070109 +0000 UTC m=+1.771977392" lastFinishedPulling="2026-04-17 14:26:24.185202045 +0000 UTC m=+19.170109340" observedRunningTime="2026-04-17 14:26:24.715027792 +0000 UTC m=+19.699935103" watchObservedRunningTime="2026-04-17 14:26:24.715701775 +0000 UTC m=+19.700609081" Apr 17 14:26:24.754400 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.754347 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-brlgp" podStartSLOduration=2.193734482 podStartE2EDuration="19.75433054s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.839541907 +0000 UTC m=+1.824449190" lastFinishedPulling="2026-04-17 14:26:24.400137953 +0000 UTC m=+19.385045248" observedRunningTime="2026-04-17 14:26:24.753942409 +0000 UTC m=+19.738849715" watchObservedRunningTime="2026-04-17 14:26:24.75433054 +0000 UTC m=+19.739237848" Apr 17 14:26:24.754741 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:24.754705 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-190.ec2.internal" podStartSLOduration=19.754693582 podStartE2EDuration="19.754693582s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:26:24.7278239 +0000 UTC m=+19.712731204" watchObservedRunningTime="2026-04-17 14:26:24.754693582 +0000 UTC m=+19.739600886" Apr 17 14:26:25.576924 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.576834 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:25.577529 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:25.576958 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608" Apr 17 14:26:25.694870 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.694832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d6hck" event={"ID":"29d21c65-516d-41f7-8313-d3fd5a97d74a","Type":"ContainerStarted","Data":"67d6d852aee32ed4b90a8e5ba8800c3703b051b868eabdcaf0159ccd6cc7266d"} Apr 17 14:26:25.696222 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.696192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lcdd5" event={"ID":"12f3294c-ef76-4702-8d03-5991c66cadb2","Type":"ContainerStarted","Data":"25318da3056a263cb937274cdb4885b246b246de46b5ab6b7d414bbb373ee857"} Apr 17 14:26:25.697595 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.697569 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" event={"ID":"fc19c040-93e2-4007-93c1-ee24954d0d5a","Type":"ContainerStarted","Data":"c59f8a5ba4ad9a144bc68ff1baf18dadfd07690a3e7d1f49e4ed8bf5480b202d"} Apr 17 14:26:25.698961 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.698937 2572 generic.go:358] "Generic (PLEG): container finished" podID="27f7804c58390a6a20607364ebeb663b" containerID="60094aefd520c08a2cec9ce4cfd3a6619f9f3875ad7e7cadb7dd99671a1bae7f" exitCode=0 Apr 17 14:26:25.699049 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.698965 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal" event={"ID":"27f7804c58390a6a20607364ebeb663b","Type":"ContainerDied","Data":"60094aefd520c08a2cec9ce4cfd3a6619f9f3875ad7e7cadb7dd99671a1bae7f"} Apr 17 14:26:25.700473 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.700451 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1097dd8-2309-4ac7-ae1d-b1ca093e2063" containerID="4d923e5c9568ad69850af914242ca67b24c20fe397dada785fea2952d8cd46ff" exitCode=0 Apr 17 14:26:25.700561 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.700539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerDied","Data":"4d923e5c9568ad69850af914242ca67b24c20fe397dada785fea2952d8cd46ff"} Apr 17 14:26:25.703375 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.703360 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log" Apr 17 14:26:25.703764 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.703742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"8d1950849246d15b05312b77606f6cab6a175563aa35483166e02e24252e33a9"} Apr 17 14:26:25.703845 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.703767 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"a6b8c894d4bd499ef11588941eb298bea06b8905aff4083cfdef6a815daa808a"} Apr 17 14:26:25.705180 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.705151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k494v" 
event={"ID":"de3261e7-2587-464e-ac8f-c31c1d9d88e8","Type":"ContainerStarted","Data":"e32be0551add64a13d7cca468ca1311d2d466ed241bb8ae8a1d3728b283ca1ee"}
Apr 17 14:26:25.706582 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.706561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l7r6t" event={"ID":"e7e5fd15-2e9b-40a4-90da-1410a8f629bd","Type":"ContainerStarted","Data":"027832165dd7986f75dedab7992c65ec743092452da8776e1093b95ab4185676"}
Apr 17 14:26:25.711478 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.711441 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-d6hck" podStartSLOduration=3.677009651 podStartE2EDuration="20.711417246s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.814809459 +0000 UTC m=+1.799716755" lastFinishedPulling="2026-04-17 14:26:23.849217053 +0000 UTC m=+18.834124350" observedRunningTime="2026-04-17 14:26:25.7113564 +0000 UTC m=+20.696263716" watchObservedRunningTime="2026-04-17 14:26:25.711417246 +0000 UTC m=+20.696324555"
Apr 17 14:26:25.737640 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.737585 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l7r6t" podStartSLOduration=3.341054937 podStartE2EDuration="20.737565608s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.819087203 +0000 UTC m=+1.803994486" lastFinishedPulling="2026-04-17 14:26:24.21559786 +0000 UTC m=+19.200505157" observedRunningTime="2026-04-17 14:26:25.724717188 +0000 UTC m=+20.709624517" watchObservedRunningTime="2026-04-17 14:26:25.737565608 +0000 UTC m=+20.722472915"
Apr 17 14:26:25.737811 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.737691 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-k494v" podStartSLOduration=11.341697702 podStartE2EDuration="20.73768216s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.825606673 +0000 UTC m=+1.810513957" lastFinishedPulling="2026-04-17 14:26:16.221591127 +0000 UTC m=+11.206498415" observedRunningTime="2026-04-17 14:26:25.737141787 +0000 UTC m=+20.722049093" watchObservedRunningTime="2026-04-17 14:26:25.73768216 +0000 UTC m=+20.722589470"
Apr 17 14:26:25.770962 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.770923 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lcdd5" podStartSLOduration=3.358554509 podStartE2EDuration="20.77090474s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.803155962 +0000 UTC m=+1.788063245" lastFinishedPulling="2026-04-17 14:26:24.21550618 +0000 UTC m=+19.200413476" observedRunningTime="2026-04-17 14:26:25.770584478 +0000 UTC m=+20.755491784" watchObservedRunningTime="2026-04-17 14:26:25.77090474 +0000 UTC m=+20.755812046"
Apr 17 14:26:25.921656 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:25.921358 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 14:26:26.513057 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.512765 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:26:25.921582503Z","UUID":"83b775c7-8d84-4ea7-9dd6-ed4718f53f35","Handler":null,"Name":"","Endpoint":""}
Apr 17 14:26:26.514705 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.514684 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 14:26:26.514705 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.514712 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 14:26:26.575724 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.575690 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:26.575856 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:26.575834 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:26.688115 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.688080 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:26.709960 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.709928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" event={"ID":"fc19c040-93e2-4007-93c1-ee24954d0d5a","Type":"ContainerStarted","Data":"8a56dd40a08f3ac39ebd0901bc828671fb4a164818d0ec4d5577ce0c445a78a0"}
Apr 17 14:26:26.711393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.711364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal" event={"ID":"27f7804c58390a6a20607364ebeb663b","Type":"ContainerStarted","Data":"d7005b5c4e0ace5a14569003cbdebf6332349b2eb0521d306b1880d676c56fb1"}
Apr 17 14:26:26.730569 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:26.730523 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-190.ec2.internal" podStartSLOduration=21.730511671 podStartE2EDuration="21.730511671s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:26:26.729889841 +0000 UTC m=+21.714797143" watchObservedRunningTime="2026-04-17 14:26:26.730511671 +0000 UTC m=+21.715418977"
Apr 17 14:26:27.575854 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:27.575821 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:27.576118 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:27.575942 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:27.715944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:27.715881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" event={"ID":"fc19c040-93e2-4007-93c1-ee24954d0d5a","Type":"ContainerStarted","Data":"2a28b9ae5b18d3a7d5d3c34cd858b94588e378ad4665fbc82366ca50d7a4d88d"}
Apr 17 14:26:27.719044 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:27.719015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:26:27.719484 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:27.719458 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"4f9b26998ae2348a00e8503e06b6af9127c0567ba9f66362eff9a592506a9923"}
Apr 17 14:26:27.732986 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:27.732943 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8zfj2" podStartSLOduration=2.751456846 podStartE2EDuration="22.732929459s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.775000244 +0000 UTC m=+1.759907528" lastFinishedPulling="2026-04-17 14:26:26.756472854 +0000 UTC m=+21.741380141" observedRunningTime="2026-04-17 14:26:27.732396698 +0000 UTC m=+22.717304005" watchObservedRunningTime="2026-04-17 14:26:27.732929459 +0000 UTC m=+22.717836764"
Apr 17 14:26:28.153259 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:28.153227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:28.153832 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:28.153803 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:28.576147 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:28.576065 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:28.576311 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:28.576196 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:28.721949 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:28.721922 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-d6hck"
Apr 17 14:26:29.575415 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:29.575377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:29.575607 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:29.575510 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:30.576319 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.576136 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:30.576771 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:30.576394 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:30.726795 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.726768 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:26:30.727065 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.727045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"e7c8a82343c8ea0b24e3e93c0eb08f6f38843c2ea07a98367b0baeb0a13e853a"}
Apr 17 14:26:30.727342 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.727311 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:30.727574 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.727558 2572 scope.go:117] "RemoveContainer" containerID="3fc58b82511d71bdf7abd51e42802a9db5bc8c98c137f130be4ea1eb82bf70de"
Apr 17 14:26:30.728742 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.728722 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1097dd8-2309-4ac7-ae1d-b1ca093e2063" containerID="ba3dca8336340cd4cd684fc2ad5c1d2907af33dd96770e70d65222048f0b2b0d" exitCode=0
Apr 17 14:26:30.728840 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.728776 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerDied","Data":"ba3dca8336340cd4cd684fc2ad5c1d2907af33dd96770e70d65222048f0b2b0d"}
Apr 17 14:26:30.743210 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:30.743192 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:31.575935 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.575773 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:31.576033 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:31.576006 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:31.732939 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.732917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:26:31.733355 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.733244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" event={"ID":"c607d8f0-4652-40d6-a3b8-74f2c8fcc998","Type":"ContainerStarted","Data":"6d13699d68401f5da495f9e513e27aaa91684293967da636316516eb6e2489bd"}
Apr 17 14:26:31.733533 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.733504 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:31.733608 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.733536 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:31.735168 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.735142 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1097dd8-2309-4ac7-ae1d-b1ca093e2063" containerID="be96dbaab87d1f9e9a0baaba311e98b1557bef0d45a2deaaed7fd01541205070" exitCode=0
Apr 17 14:26:31.735251 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.735172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerDied","Data":"be96dbaab87d1f9e9a0baaba311e98b1557bef0d45a2deaaed7fd01541205070"}
Apr 17 14:26:31.748013 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.747983 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:26:31.755667 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.755643 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d4f88"]
Apr 17 14:26:31.755762 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.755739 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:31.755832 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:31.755815 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:31.760018 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.759984 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f" podStartSLOduration=9.319033551 podStartE2EDuration="26.759972679s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.844159794 +0000 UTC m=+1.829067078" lastFinishedPulling="2026-04-17 14:26:24.285098922 +0000 UTC m=+19.270006206" observedRunningTime="2026-04-17 14:26:31.758663659 +0000 UTC m=+26.743570974" watchObservedRunningTime="2026-04-17 14:26:31.759972679 +0000 UTC m=+26.744879962"
Apr 17 14:26:31.762132 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.762114 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b4mhh"]
Apr 17 14:26:31.762229 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:31.762217 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:31.762328 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:31.762311 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:32.738545 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:32.738517 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1097dd8-2309-4ac7-ae1d-b1ca093e2063" containerID="a7b8df0b1d9a64fe1a094311bc04f025ff9eaa8dad2732cd9255203b7ef9a07a" exitCode=0
Apr 17 14:26:32.738960 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:32.738577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerDied","Data":"a7b8df0b1d9a64fe1a094311bc04f025ff9eaa8dad2732cd9255203b7ef9a07a"}
Apr 17 14:26:33.576265 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:33.576191 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:33.576265 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:33.576220 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:33.576423 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:33.576331 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:33.576478 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:33.576418 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:35.576778 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:35.576703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:35.577341 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:35.576797 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:35.577341 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:35.576829 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4f88" podUID="479a0d66-ba09-406a-9da8-b98589e81608"
Apr 17 14:26:35.577341 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:35.576863 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07"
Apr 17 14:26:37.319497 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.319218 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-190.ec2.internal" event="NodeReady"
Apr 17 14:26:37.319947 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.319535 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 14:26:37.353049 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.353014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d5f5d55cb-66jv6"]
Apr 17 14:26:37.388545 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.388468 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft"]
Apr 17 14:26:37.389392 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.388911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.394267 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.394213 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 14:26:37.394522 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.394450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 14:26:37.394661 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.394535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gbrc2\""
Apr 17 14:26:37.394661 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.394535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 14:26:37.397823 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.397657 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 14:26:37.403543 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.403524 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"]
Apr 17 14:26:37.403662 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.403646 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft"
Apr 17 14:26:37.405825 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.405806 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-khdsc\""
Apr 17 14:26:37.405932 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.405863 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 14:26:37.405932 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.405874 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 14:26:37.406026 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.405990 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 14:26:37.406146 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.406082 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 14:26:37.428726 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.428694 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6"]
Apr 17 14:26:37.428864 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.428844 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.430961 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.430938 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 14:26:37.431075 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.431059 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 14:26:37.431185 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.431162 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 14:26:37.431298 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.431207 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 14:26:37.447617 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.447599 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qfmj2"]
Apr 17 14:26:37.447841 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.447816 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6"
Apr 17 14:26:37.450105 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.450086 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 14:26:37.464400 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.464372 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-slgs9"]
Apr 17 14:26:37.464547 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.464527 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfmj2"
Apr 17 14:26:37.466806 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.466777 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 14:26:37.466806 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.466792 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 14:26:37.466806 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.466802 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qz9tk\""
Apr 17 14:26:37.467109 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.467089 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 14:26:37.479302 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479284 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft"]
Apr 17 14:26:37.479381 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479312 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5f5d55cb-66jv6"]
Apr 17 14:26:37.479381 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479328 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"]
Apr 17 14:26:37.479381 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479340 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6"]
Apr 17 14:26:37.479381 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479351 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qfmj2"]
Apr 17 14:26:37.479381 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479360 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-slgs9"]
Apr 17 14:26:37.479625 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.479426 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-slgs9"
Apr 17 14:26:37.481773 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.481754 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 14:26:37.481863 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.481760 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p2qzh\""
Apr 17 14:26:37.481863 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.481805 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 14:26:37.489562 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.489660 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489582 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.489660 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/744f381c-27ec-4d42-9ea2-2346a9303e65-ca-trust-extracted\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.489660 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbhjn\" (UniqueName: \"kubernetes.io/projected/4a34165c-c742-40e7-b117-8bc0046f32d4-kube-api-access-jbhjn\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.489800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-ca\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.489800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-bound-sa-token\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.489800 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjcp\" (UniqueName: \"kubernetes.io/projected/1be854c3-3a12-46a0-9f6e-9e1fccba5835-kube-api-access-ggjcp\") pod \"managed-serviceaccount-addon-agent-564656ff9f-8rvft\" (UID: \"1be854c3-3a12-46a0-9f6e-9e1fccba5835\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft"
Apr 17 14:26:37.489928 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489826 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1be854c3-3a12-46a0-9f6e-9e1fccba5835-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-564656ff9f-8rvft\" (UID: \"1be854c3-3a12-46a0-9f6e-9e1fccba5835\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft"
Apr 17 14:26:37.489928 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-installation-pull-secrets\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.489928 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4a34165c-c742-40e7-b117-8bc0046f32d4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.490047 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-image-registry-private-configuration\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.490047 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.489966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-trusted-ca\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.490047 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.490005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.490159 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.490045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-certificates\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.490159 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.490069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2wxb\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-kube-api-access-b2wxb\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:37.490159 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.490089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-hub\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"
Apr 17 14:26:37.575553 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.575477 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:26:37.575715 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.575481 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:26:37.578371 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.578064 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:26:37.578371 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.578079 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqr8w\""
Apr 17 14:26:37.578371 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.578084 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 14:26:37.578371 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.578102 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w222h\""
Apr 17 14:26:37.578371 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.578117 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:26:37.591289 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591263 2572 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.591389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/744f381c-27ec-4d42-9ea2-2346a9303e65-ca-trust-extracted\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbhjn\" (UniqueName: \"kubernetes.io/projected/4a34165c-c742-40e7-b117-8bc0046f32d4-kube-api-access-jbhjn\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.591389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591358 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-ca\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.591389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4663e8d-b134-432d-b180-efed510f0b7e-tmp-dir\") pod 
\"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.591603 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-bound-sa-token\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591603 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjcp\" (UniqueName: \"kubernetes.io/projected/1be854c3-3a12-46a0-9f6e-9e1fccba5835-kube-api-access-ggjcp\") pod \"managed-serviceaccount-addon-agent-564656ff9f-8rvft\" (UID: \"1be854c3-3a12-46a0-9f6e-9e1fccba5835\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" Apr 17 14:26:37.591603 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1be854c3-3a12-46a0-9f6e-9e1fccba5835-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-564656ff9f-8rvft\" (UID: \"1be854c3-3a12-46a0-9f6e-9e1fccba5835\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" Apr 17 14:26:37.591603 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-installation-pull-secrets\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591603 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:26:37.591594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4663e8d-b134-432d-b180-efed510f0b7e-config-volume\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591617 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39ea32e8-f57b-4997-8172-21d875d83841-tmp\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591686 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4a34165c-c742-40e7-b117-8bc0046f32d4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-image-registry-private-configuration\") 
pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/744f381c-27ec-4d42-9ea2-2346a9303e65-ca-trust-extracted\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-trusted-ca\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591786 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/39ea32e8-f57b-4997-8172-21d875d83841-klusterlet-config\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.591882 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6hbp\" (UniqueName: \"kubernetes.io/projected/067a27fb-e850-42e3-8f46-7d062e8e4ac4-kube-api-access-v6hbp\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:37.592326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591913 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-certificates\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.592326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2wxb\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-kube-api-access-b2wxb\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.592326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591967 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57fc\" (UniqueName: \"kubernetes.io/projected/e4663e8d-b134-432d-b180-efed510f0b7e-kube-api-access-z57fc\") pod \"dns-default-slgs9\" (UID: 
\"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.592326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.591995 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkqp\" (UniqueName: \"kubernetes.io/projected/39ea32e8-f57b-4997-8172-21d875d83841-kube-api-access-mbkqp\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.592326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.592027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-hub\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.592326 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.592055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.592708 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.592522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4a34165c-c742-40e7-b117-8bc0046f32d4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.592708 ip-10-0-130-190 kubenswrapper[2572]: E0417 
14:26:37.592666 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:26:37.592708 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:37.592682 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found Apr 17 14:26:37.592831 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:37.592744 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.092724525 +0000 UTC m=+33.077631822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found Apr 17 14:26:37.593307 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.593127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-trusted-ca\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.593415 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.593372 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-certificates\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.596881 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.596661 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-installation-pull-secrets\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.596881 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.596729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-hub\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.596881 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.596801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.596881 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.596813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.597107 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.596888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4a34165c-c742-40e7-b117-8bc0046f32d4-ca\") pod 
\"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.597107 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.597074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1be854c3-3a12-46a0-9f6e-9e1fccba5835-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-564656ff9f-8rvft\" (UID: \"1be854c3-3a12-46a0-9f6e-9e1fccba5835\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" Apr 17 14:26:37.597424 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.597382 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-image-registry-private-configuration\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.599403 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.599375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-bound-sa-token\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.599533 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.599520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjcp\" (UniqueName: \"kubernetes.io/projected/1be854c3-3a12-46a0-9f6e-9e1fccba5835-kube-api-access-ggjcp\") pod \"managed-serviceaccount-addon-agent-564656ff9f-8rvft\" (UID: \"1be854c3-3a12-46a0-9f6e-9e1fccba5835\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" Apr 17 14:26:37.599704 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.599677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbhjn\" (UniqueName: \"kubernetes.io/projected/4a34165c-c742-40e7-b117-8bc0046f32d4-kube-api-access-jbhjn\") pod \"cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb\" (UID: \"4a34165c-c742-40e7-b117-8bc0046f32d4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.600354 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.600327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2wxb\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-kube-api-access-b2wxb\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:37.692586 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4663e8d-b134-432d-b180-efed510f0b7e-tmp-dir\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.692796 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4663e8d-b134-432d-b180-efed510f0b7e-config-volume\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.692796 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.692796 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39ea32e8-f57b-4997-8172-21d875d83841-tmp\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/39ea32e8-f57b-4997-8172-21d875d83841-klusterlet-config\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6hbp\" (UniqueName: \"kubernetes.io/projected/067a27fb-e850-42e3-8f46-7d062e8e4ac4-kube-api-access-v6hbp\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:37.692878 2572 secret.go:189] 
Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z57fc\" (UniqueName: \"kubernetes.io/projected/e4663e8d-b134-432d-b180-efed510f0b7e-kube-api-access-z57fc\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.692928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkqp\" (UniqueName: \"kubernetes.io/projected/39ea32e8-f57b-4997-8172-21d875d83841-kube-api-access-mbkqp\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.692968 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:37.692947 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.192925382 +0000 UTC m=+33.177832681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found Apr 17 14:26:37.693252 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.693022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4663e8d-b134-432d-b180-efed510f0b7e-tmp-dir\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.693252 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.693091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39ea32e8-f57b-4997-8172-21d875d83841-tmp\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.693252 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:37.693163 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:37.693252 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:37.693217 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.193202099 +0000 UTC m=+33.178109384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found Apr 17 14:26:37.693386 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.693311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4663e8d-b134-432d-b180-efed510f0b7e-config-volume\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.695739 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.695717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/39ea32e8-f57b-4997-8172-21d875d83841-klusterlet-config\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.710110 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.710090 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6hbp\" (UniqueName: \"kubernetes.io/projected/067a27fb-e850-42e3-8f46-7d062e8e4ac4-kube-api-access-v6hbp\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:37.710226 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.710120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkqp\" (UniqueName: \"kubernetes.io/projected/39ea32e8-f57b-4997-8172-21d875d83841-kube-api-access-mbkqp\") pod \"klusterlet-addon-workmgr-f987cb679-5gdd6\" (UID: \"39ea32e8-f57b-4997-8172-21d875d83841\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:37.710272 
ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.710237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57fc\" (UniqueName: \"kubernetes.io/projected/e4663e8d-b134-432d-b180-efed510f0b7e-kube-api-access-z57fc\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:37.720060 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.720039 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" Apr 17 14:26:37.738011 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.737986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:26:37.767101 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:37.767076 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:38.097049 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.097012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:38.097232 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.097186 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:26:38.097232 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.097208 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not 
found Apr 17 14:26:38.097340 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.097273 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:39.097252752 +0000 UTC m=+34.082160040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found Apr 17 14:26:38.197964 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.197916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:38.197964 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.197965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.198016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.198067 2572 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.198108 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.198142 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:26:39.19812058 +0000 UTC m=+34.183027882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.198144 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.198165 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:10.198149181 +0000 UTC m=+65.183056467 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : secret "metrics-daemon-secret" not found Apr 17 14:26:38.198194 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:38.198181 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:39.198172284 +0000 UTC m=+34.183079568 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found Apr 17 14:26:38.298756 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.298714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:38.301687 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.301658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6rgx\" (UniqueName: \"kubernetes.io/projected/479a0d66-ba09-406a-9da8-b98589e81608-kube-api-access-h6rgx\") pod \"network-check-target-d4f88\" (UID: \"479a0d66-ba09-406a-9da8-b98589e81608\") " pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:38.486953 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:38.486919 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:39.106055 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.106018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:39.106289 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.106191 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:26:39.106289 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.106213 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found Apr 17 14:26:39.106289 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.106280 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:41.106260613 +0000 UTC m=+36.091167903 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found Apr 17 14:26:39.206837 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.206796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:39.207020 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.206916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:39.207020 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.206975 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:39.207142 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.207038 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:39.207142 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.207055 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:41.207034753 +0000 UTC m=+36.191942058 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found Apr 17 14:26:39.207142 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:39.207086 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:26:41.207071889 +0000 UTC m=+36.191979172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found Apr 17 14:26:39.504247 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.504217 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6"] Apr 17 14:26:39.507323 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.507300 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb"] Apr 17 14:26:39.509705 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.509683 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft"] Apr 17 14:26:39.518007 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.517972 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d4f88"] Apr 17 14:26:39.583342 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:39.583313 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ea32e8_f57b_4997_8172_21d875d83841.slice/crio-67ced5402126fffaba1600cca8b935cae85591a228b2ff5b7894a4479f3019e9 WatchSource:0}: Error finding container 67ced5402126fffaba1600cca8b935cae85591a228b2ff5b7894a4479f3019e9: Status 404 returned error can't find the container with id 67ced5402126fffaba1600cca8b935cae85591a228b2ff5b7894a4479f3019e9 Apr 17 14:26:39.585386 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:39.584405 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a34165c_c742_40e7_b117_8bc0046f32d4.slice/crio-4017051ef2604385f25090a5a62f2915ef5d09bf7fb097f5de0be546948057da WatchSource:0}: Error finding container 4017051ef2604385f25090a5a62f2915ef5d09bf7fb097f5de0be546948057da: Status 404 returned error can't find the container with id 4017051ef2604385f25090a5a62f2915ef5d09bf7fb097f5de0be546948057da Apr 17 14:26:39.758511 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.758329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d4f88" event={"ID":"479a0d66-ba09-406a-9da8-b98589e81608","Type":"ContainerStarted","Data":"4cf7de9133a96180600e1d0ed2034f764999fb9ce29039703b62e7c017484179"} Apr 17 14:26:39.759219 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.759197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" event={"ID":"1be854c3-3a12-46a0-9f6e-9e1fccba5835","Type":"ContainerStarted","Data":"193a6cd758684a1e914f13404d8e1886c19303e516a349c4bd501577d1883926"} Apr 17 14:26:39.760055 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.760033 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" 
event={"ID":"39ea32e8-f57b-4997-8172-21d875d83841","Type":"ContainerStarted","Data":"67ced5402126fffaba1600cca8b935cae85591a228b2ff5b7894a4479f3019e9"} Apr 17 14:26:39.760914 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:39.760892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" event={"ID":"4a34165c-c742-40e7-b117-8bc0046f32d4","Type":"ContainerStarted","Data":"4017051ef2604385f25090a5a62f2915ef5d09bf7fb097f5de0be546948057da"} Apr 17 14:26:40.770868 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:40.769923 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1097dd8-2309-4ac7-ae1d-b1ca093e2063" containerID="04b82cd31aca31631e096ef5ec6ecf2ea5df95b09fbb7f5864cbeb44575505a3" exitCode=0 Apr 17 14:26:40.770868 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:40.770000 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerDied","Data":"04b82cd31aca31631e096ef5ec6ecf2ea5df95b09fbb7f5864cbeb44575505a3"} Apr 17 14:26:41.123797 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:41.123754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:41.124036 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:41.124020 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:26:41.124098 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:41.124040 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found Apr 17 14:26:41.124150 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:41.124101 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:45.124080819 +0000 UTC m=+40.108988116 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found Apr 17 14:26:41.225113 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:41.225074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:41.225288 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:41.225147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:41.225288 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:41.225222 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:41.225288 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:41.225247 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:41.225452 ip-10-0-130-190 kubenswrapper[2572]: 
E0417 14:26:41.225293 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:45.225274516 +0000 UTC m=+40.210181801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found Apr 17 14:26:41.225452 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:41.225319 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:26:45.225306143 +0000 UTC m=+40.210213431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found Apr 17 14:26:41.781757 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:41.780828 2572 generic.go:358] "Generic (PLEG): container finished" podID="f1097dd8-2309-4ac7-ae1d-b1ca093e2063" containerID="2b45e4137f6b73834138410fd9d4cb81c20e7d5373fc3ecd8c4d1a5e07d83925" exitCode=0 Apr 17 14:26:41.781757 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:41.780886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerDied","Data":"2b45e4137f6b73834138410fd9d4cb81c20e7d5373fc3ecd8c4d1a5e07d83925"} Apr 17 14:26:45.157482 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:45.157419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:26:45.157992 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.157574 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:26:45.157992 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.157595 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found Apr 17 14:26:45.157992 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.157650 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:53.157634609 +0000 UTC m=+48.142541896 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found Apr 17 14:26:45.258408 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:45.258370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:26:45.258566 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:45.258420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:26:45.258566 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.258522 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:45.258566 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.258525 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:45.258566 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.258565 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:53.258552633 +0000 UTC m=+48.243459916 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found Apr 17 14:26:45.258731 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:45.258582 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:26:53.258574305 +0000 UTC m=+48.243481588 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found Apr 17 14:26:47.794768 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.794684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" event={"ID":"4a34165c-c742-40e7-b117-8bc0046f32d4","Type":"ContainerStarted","Data":"69dafdc5e8c50461643a35ec16b96dde257b7b5f9561667fe17ab1fae29e214f"} Apr 17 14:26:47.797485 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.797459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65fjv" event={"ID":"f1097dd8-2309-4ac7-ae1d-b1ca093e2063","Type":"ContainerStarted","Data":"56bcf743e24fa40fe6bde516be2f820971b39e4d7c4138799cebf96a8154b9da"} Apr 17 14:26:47.798752 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.798728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d4f88" event={"ID":"479a0d66-ba09-406a-9da8-b98589e81608","Type":"ContainerStarted","Data":"1d0527779bbfc3776dbfcde20278970611af1a633cb654367dbb98ccfa56153c"} Apr 17 14:26:47.798903 
ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.798876 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d4f88" Apr 17 14:26:47.799881 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.799859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" event={"ID":"1be854c3-3a12-46a0-9f6e-9e1fccba5835","Type":"ContainerStarted","Data":"94ce0ead4b704d6c8a07752c51d61c70e2710f50af42c0dd6908e79b176c46ab"} Apr 17 14:26:47.800956 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.800938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" event={"ID":"39ea32e8-f57b-4997-8172-21d875d83841","Type":"ContainerStarted","Data":"ffd76bb579fd5e00de86393a7fdd29bd1fe166e804608a38d88be8fb20850114"} Apr 17 14:26:47.801179 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.801158 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:47.802610 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.802593 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:26:47.819250 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.819207 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-65fjv" podStartSLOduration=10.013178041 podStartE2EDuration="42.819195159s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:06.832149725 +0000 UTC m=+1.817057013" lastFinishedPulling="2026-04-17 14:26:39.638166848 +0000 UTC m=+34.623074131" observedRunningTime="2026-04-17 14:26:47.817265996 +0000 UTC 
m=+42.802173302" watchObservedRunningTime="2026-04-17 14:26:47.819195159 +0000 UTC m=+42.804102463" Apr 17 14:26:47.832135 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.832096 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" podStartSLOduration=11.20662369 podStartE2EDuration="18.832087442s" podCreationTimestamp="2026-04-17 14:26:29 +0000 UTC" firstStartedPulling="2026-04-17 14:26:39.588895409 +0000 UTC m=+34.573802693" lastFinishedPulling="2026-04-17 14:26:47.214359162 +0000 UTC m=+42.199266445" observedRunningTime="2026-04-17 14:26:47.831374594 +0000 UTC m=+42.816281899" watchObservedRunningTime="2026-04-17 14:26:47.832087442 +0000 UTC m=+42.816994747" Apr 17 14:26:47.844925 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.844883 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d4f88" podStartSLOduration=35.246374092 podStartE2EDuration="42.844870697s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:26:39.616773222 +0000 UTC m=+34.601680507" lastFinishedPulling="2026-04-17 14:26:47.215269816 +0000 UTC m=+42.200177112" observedRunningTime="2026-04-17 14:26:47.84423755 +0000 UTC m=+42.829144855" watchObservedRunningTime="2026-04-17 14:26:47.844870697 +0000 UTC m=+42.829778002" Apr 17 14:26:47.858605 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:47.858566 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" podStartSLOduration=11.214249758 podStartE2EDuration="18.858555439s" podCreationTimestamp="2026-04-17 14:26:29 +0000 UTC" firstStartedPulling="2026-04-17 14:26:39.58642017 +0000 UTC m=+34.571327457" lastFinishedPulling="2026-04-17 14:26:47.230725847 +0000 UTC m=+42.215633138" observedRunningTime="2026-04-17 
14:26:47.858108147 +0000 UTC m=+42.843015464" watchObservedRunningTime="2026-04-17 14:26:47.858555439 +0000 UTC m=+42.843462743" Apr 17 14:26:48.945976 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:48.945941 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-l4b6r"] Apr 17 14:26:48.963931 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:48.963899 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l4b6r"] Apr 17 14:26:48.964086 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:48.964027 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l4b6r" Apr 17 14:26:48.966351 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:48.966328 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:26:49.088506 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.088459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/de388feb-3144-4718-889c-452bc20215d4-dbus\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r" Apr 17 14:26:49.088705 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.088584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/de388feb-3144-4718-889c-452bc20215d4-kubelet-config\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r" Apr 17 14:26:49.088769 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.088701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/de388feb-3144-4718-889c-452bc20215d4-original-pull-secret\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.189663 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.189626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/de388feb-3144-4718-889c-452bc20215d4-kubelet-config\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.189823 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.189698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de388feb-3144-4718-889c-452bc20215d4-original-pull-secret\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.189823 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.189749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/de388feb-3144-4718-889c-452bc20215d4-dbus\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.189823 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.189750 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/de388feb-3144-4718-889c-452bc20215d4-kubelet-config\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.189974 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.189960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/de388feb-3144-4718-889c-452bc20215d4-dbus\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.193393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.193360 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de388feb-3144-4718-889c-452bc20215d4-original-pull-secret\") pod \"global-pull-secret-syncer-l4b6r\" (UID: \"de388feb-3144-4718-889c-452bc20215d4\") " pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.273474 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.273371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l4b6r"
Apr 17 14:26:49.388329 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.388297 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l4b6r"]
Apr 17 14:26:49.402224 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:26:49.402198 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde388feb_3144_4718_889c_452bc20215d4.slice/crio-7acad0ba5ad1dcc2ab8e44f0f182f12a96fe6218e11bd1eeaa1dffd548c10e4a WatchSource:0}: Error finding container 7acad0ba5ad1dcc2ab8e44f0f182f12a96fe6218e11bd1eeaa1dffd548c10e4a: Status 404 returned error can't find the container with id 7acad0ba5ad1dcc2ab8e44f0f182f12a96fe6218e11bd1eeaa1dffd548c10e4a
Apr 17 14:26:49.806296 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:49.806258 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l4b6r" event={"ID":"de388feb-3144-4718-889c-452bc20215d4","Type":"ContainerStarted","Data":"7acad0ba5ad1dcc2ab8e44f0f182f12a96fe6218e11bd1eeaa1dffd548c10e4a"}
Apr 17 14:26:51.815066 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:51.815010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" event={"ID":"4a34165c-c742-40e7-b117-8bc0046f32d4","Type":"ContainerStarted","Data":"b3810068416f7c6d59cadeca9fa52a3e5674383a07c183d870d37ce07833f707"}
Apr 17 14:26:52.821097 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:52.821066 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" event={"ID":"4a34165c-c742-40e7-b117-8bc0046f32d4","Type":"ContainerStarted","Data":"55a3f944380f5d8c1359ac660ad148744f702556cbd18940ba390ebfe9a809ab"}
Apr 17 14:26:52.838671 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:52.838615 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" podStartSLOduration=11.840948762 podStartE2EDuration="23.838597476s" podCreationTimestamp="2026-04-17 14:26:29 +0000 UTC" firstStartedPulling="2026-04-17 14:26:39.588139088 +0000 UTC m=+34.573046385" lastFinishedPulling="2026-04-17 14:26:51.585787801 +0000 UTC m=+46.570695099" observedRunningTime="2026-04-17 14:26:52.83762277 +0000 UTC m=+47.822530300" watchObservedRunningTime="2026-04-17 14:26:52.838597476 +0000 UTC m=+47.823504782"
Apr 17 14:26:53.223793 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:53.223746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:26:53.223967 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.223913 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:26:53.223967 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.223937 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found
Apr 17 14:26:53.224093 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.224004 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:09.223982807 +0000 UTC m=+64.208890094 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found
Apr 17 14:26:53.325132 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:53.325094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9"
Apr 17 14:26:53.325315 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:53.325157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2"
Apr 17 14:26:53.325315 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.325262 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:26:53.325420 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.325327 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:09.325312614 +0000 UTC m=+64.310219898 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found
Apr 17 14:26:53.325420 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.325265 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:26:53.325549 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:26:53.325419 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:27:09.325397566 +0000 UTC m=+64.310304873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found
Apr 17 14:26:54.829871 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:54.829825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l4b6r" event={"ID":"de388feb-3144-4718-889c-452bc20215d4","Type":"ContainerStarted","Data":"06e85c8ba8bcb630e90c1fdd2f42609cbbcd82d9fcbf41e9037bcd0336e07f5b"}
Apr 17 14:26:54.843891 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:26:54.843850 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-l4b6r" podStartSLOduration=2.370458211 podStartE2EDuration="6.843837007s" podCreationTimestamp="2026-04-17 14:26:48 +0000 UTC" firstStartedPulling="2026-04-17 14:26:49.403844573 +0000 UTC m=+44.388751856" lastFinishedPulling="2026-04-17 14:26:53.877223368 +0000 UTC m=+48.862130652" observedRunningTime="2026-04-17 14:26:54.84272664 +0000 UTC m=+49.827633950" watchObservedRunningTime="2026-04-17 14:26:54.843837007 +0000 UTC m=+49.828744342"
Apr 17 14:27:03.751886 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:03.751855 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nck2f"
Apr 17 14:27:09.231477 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:09.231409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:27:09.231869 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.231561 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:27:09.231869 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.231582 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found
Apr 17 14:27:09.231869 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.231646 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:41.23163003 +0000 UTC m=+96.216537313 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found
Apr 17 14:27:09.332032 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:09.331996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9"
Apr 17 14:27:09.332146 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:09.332053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2"
Apr 17 14:27:09.332189 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.332158 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:27:09.332220 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.332213 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:41.332199691 +0000 UTC m=+96.317106975 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found
Apr 17 14:27:09.332266 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.332158 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:27:09.332305 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:09.332295 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:27:41.332282334 +0000 UTC m=+96.317189630 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found
Apr 17 14:27:10.238320 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:10.238275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:27:10.238724 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:10.238451 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:27:10.238724 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:10.238520 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:14.238504853 +0000 UTC m=+129.223412136 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : secret "metrics-daemon-secret" not found
Apr 17 14:27:18.805915 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:18.805886 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d4f88"
Apr 17 14:27:41.267535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:41.267382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6"
Apr 17 14:27:41.268006 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.267542 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:27:41.268006 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.267566 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5f5d55cb-66jv6: secret "image-registry-tls" not found
Apr 17 14:27:41.268006 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.267650 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls podName:744f381c-27ec-4d42-9ea2-2346a9303e65 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:45.267629739 +0000 UTC m=+160.252537037 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls") pod "image-registry-d5f5d55cb-66jv6" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65") : secret "image-registry-tls" not found
Apr 17 14:27:41.368053 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:41.368016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9"
Apr 17 14:27:41.368197 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:27:41.368074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2"
Apr 17 14:27:41.368197 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.368166 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:27:41.368269 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.368241 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls podName:e4663e8d-b134-432d-b180-efed510f0b7e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:45.368225148 +0000 UTC m=+160.353132436 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls") pod "dns-default-slgs9" (UID: "e4663e8d-b134-432d-b180-efed510f0b7e") : secret "dns-default-metrics-tls" not found
Apr 17 14:27:41.368269 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.368175 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:27:41.368351 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:27:41.368300 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert podName:067a27fb-e850-42e3-8f46-7d062e8e4ac4 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:45.368288708 +0000 UTC m=+160.353195992 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert") pod "ingress-canary-qfmj2" (UID: "067a27fb-e850-42e3-8f46-7d062e8e4ac4") : secret "canary-serving-cert" not found
Apr 17 14:28:06.664128 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:06.664100 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l7r6t_e7e5fd15-2e9b-40a4-90da-1410a8f629bd/dns-node-resolver/0.log"
Apr 17 14:28:07.863297 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:07.863270 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-k494v_de3261e7-2587-464e-ac8f-c31c1d9d88e8/node-ca/0.log"
Apr 17 14:28:14.313454 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:14.313380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh"
Apr 17 14:28:14.313934 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:14.313532 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:28:14.313934 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:14.313598 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs podName:1e20b346-d933-444b-947f-2bb4b05a5b07 nodeName:}" failed. No retries permitted until 2026-04-17 14:30:16.313581401 +0000 UTC m=+251.298488685 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs") pod "network-metrics-daemon-b4mhh" (UID: "1e20b346-d933-444b-947f-2bb4b05a5b07") : secret "metrics-daemon-secret" not found
Apr 17 14:28:27.574683 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.574650 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7swq7"]
Apr 17 14:28:27.577797 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.577778 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.580829 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.580807 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 14:28:27.580829 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.580826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 14:28:27.580987 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.580808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 14:28:27.580987 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.580807 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 14:28:27.581088 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.581022 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wp7x7\""
Apr 17 14:28:27.588199 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.588179 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7swq7"]
Apr 17 14:28:27.616131 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.616101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e687083-546e-415b-a585-899a3e577344-data-volume\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.616255 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.616136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdn9\" (UniqueName: \"kubernetes.io/projected/6e687083-546e-415b-a585-899a3e577344-kube-api-access-qvdn9\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.616255 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.616165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e687083-546e-415b-a585-899a3e577344-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.616255 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.616219 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e687083-546e-415b-a585-899a3e577344-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.616255 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.616246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e687083-546e-415b-a585-899a3e577344-crio-socket\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.716806 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.716769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e687083-546e-415b-a585-899a3e577344-data-volume\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.716806 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.716806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdn9\" (UniqueName: \"kubernetes.io/projected/6e687083-546e-415b-a585-899a3e577344-kube-api-access-qvdn9\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.717032 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.716826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e687083-546e-415b-a585-899a3e577344-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.717032 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.716845 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e687083-546e-415b-a585-899a3e577344-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.717032 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.716867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e687083-546e-415b-a585-899a3e577344-crio-socket\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.717032 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.716972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e687083-546e-415b-a585-899a3e577344-crio-socket\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.717323 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.717304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e687083-546e-415b-a585-899a3e577344-data-volume\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.717408 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.717392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e687083-546e-415b-a585-899a3e577344-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.719208 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.719186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e687083-546e-415b-a585-899a3e577344-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.726771 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.726752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdn9\" (UniqueName: \"kubernetes.io/projected/6e687083-546e-415b-a585-899a3e577344-kube-api-access-qvdn9\") pod \"insights-runtime-extractor-7swq7\" (UID: \"6e687083-546e-415b-a585-899a3e577344\") " pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:27.739093 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.739047 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" podUID="4a34165c-c742-40e7-b117-8bc0046f32d4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:28:27.886975 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:27.886896 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7swq7"
Apr 17 14:28:28.002972 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:28.002941 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7swq7"]
Apr 17 14:28:28.006799 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:28:28.006766 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e687083_546e_415b_a585_899a3e577344.slice/crio-3a2d417053a7894399e63be67b454b03dd2da0c7112a519e4bfe6271e05eec53 WatchSource:0}: Error finding container 3a2d417053a7894399e63be67b454b03dd2da0c7112a519e4bfe6271e05eec53: Status 404 returned error can't find the container with id 3a2d417053a7894399e63be67b454b03dd2da0c7112a519e4bfe6271e05eec53
Apr 17 14:28:28.045108 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:28.045074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7swq7" event={"ID":"6e687083-546e-415b-a585-899a3e577344","Type":"ContainerStarted","Data":"3a2d417053a7894399e63be67b454b03dd2da0c7112a519e4bfe6271e05eec53"}
Apr 17 14:28:29.048498 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:29.048391 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7swq7" event={"ID":"6e687083-546e-415b-a585-899a3e577344","Type":"ContainerStarted","Data":"8cc79b1447cfdabbefda5c7cbc82db6eb0fc8472fd609acb562f52bfc9205c27"}
Apr 17 14:28:29.048498 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:29.048443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7swq7" event={"ID":"6e687083-546e-415b-a585-899a3e577344","Type":"ContainerStarted","Data":"06e384e2d72f3a3f9e50e7c63137d8db23b4a1579abe5bb28cc8d118bd01b906"}
Apr 17 14:28:31.055082 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:31.055041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7swq7" event={"ID":"6e687083-546e-415b-a585-899a3e577344","Type":"ContainerStarted","Data":"b2c74bc308ef71d4f7adc6bf2200ea359c7a14806d9913cf3d4ecdd8819c3154"}
Apr 17 14:28:31.076711 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:31.076650 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7swq7" podStartSLOduration=1.905569936 podStartE2EDuration="4.076631443s" podCreationTimestamp="2026-04-17 14:28:27 +0000 UTC" firstStartedPulling="2026-04-17 14:28:28.06403484 +0000 UTC m=+143.048942124" lastFinishedPulling="2026-04-17 14:28:30.235096345 +0000 UTC m=+145.220003631" observedRunningTime="2026-04-17 14:28:31.07581391 +0000 UTC m=+146.060721217" watchObservedRunningTime="2026-04-17 14:28:31.076631443 +0000 UTC m=+146.061538751"
Apr 17 14:28:37.738875 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:37.738834 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" podUID="4a34165c-c742-40e7-b117-8bc0046f32d4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:28:39.889227 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.889193 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n2pc7"]
Apr 17 14:28:39.892385 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.892362 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.896988 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.896945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 14:28:39.897131 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.897014 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2m46t\""
Apr 17 14:28:39.897131 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.896950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 14:28:39.897131 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.897122 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 14:28:39.897543 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.897525 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 14:28:39.897659 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.897635 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 14:28:39.897724 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.897636 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 14:28:39.904976 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.904956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-accelerators-collector-config\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905078 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.904997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905078 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-metrics-client-ca\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905152 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-sys\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905152 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-root\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905247 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmcw9\" (UniqueName: \"kubernetes.io/projected/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-kube-api-access-dmcw9\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905247 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-textfile\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-wtmp\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:39.905393 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:39.905317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-tls\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:40.005966 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.005932 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmcw9\" (UniqueName: \"kubernetes.io/projected/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-kube-api-access-dmcw9\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7"
Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.005973
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-textfile\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.005994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-wtmp\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-tls\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-accelerators-collector-config\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") 
" pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-metrics-client-ca\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006119 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-sys\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-root\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-root\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:40.006226 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-wtmp\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006255 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-sys\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:40.006313 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-tls podName:7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:40.506290405 +0000 UTC m=+155.491197704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-tls") pod "node-exporter-n2pc7" (UID: "7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2") : secret "node-exporter-tls" not found Apr 17 14:28:40.006474 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-textfile\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006690 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.006727 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.006706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-metrics-client-ca\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.008386 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.008358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.014187 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.014168 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmcw9\" (UniqueName: \"kubernetes.io/projected/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-kube-api-access-dmcw9\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.402812 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:40.402775 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" podUID="744f381c-27ec-4d42-9ea2-2346a9303e65" Apr 17 14:28:40.480261 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:40.480217 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-ingress-canary/ingress-canary-qfmj2" podUID="067a27fb-e850-42e3-8f46-7d062e8e4ac4" Apr 17 14:28:40.489408 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:40.489373 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-slgs9" podUID="e4663e8d-b134-432d-b180-efed510f0b7e" Apr 17 14:28:40.509892 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.509867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-tls\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.512102 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.512077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2-node-exporter-tls\") pod \"node-exporter-n2pc7\" (UID: \"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2\") " pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.592328 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:28:40.592281 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-b4mhh" podUID="1e20b346-d933-444b-947f-2bb4b05a5b07" Apr 17 14:28:40.805863 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:40.805777 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-n2pc7" Apr 17 14:28:40.814234 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:28:40.814203 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ef53ff4_b680_4f2f_92c4_4028ac7e4ab2.slice/crio-7029bc7086a8a7b0f59e52ccd747c282ad0c57a879c54c80625f07d698ecdc3e WatchSource:0}: Error finding container 7029bc7086a8a7b0f59e52ccd747c282ad0c57a879c54c80625f07d698ecdc3e: Status 404 returned error can't find the container with id 7029bc7086a8a7b0f59e52ccd747c282ad0c57a879c54c80625f07d698ecdc3e Apr 17 14:28:41.080357 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:41.080275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-slgs9" Apr 17 14:28:41.080357 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:41.080285 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2pc7" event={"ID":"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2","Type":"ContainerStarted","Data":"7029bc7086a8a7b0f59e52ccd747c282ad0c57a879c54c80625f07d698ecdc3e"} Apr 17 14:28:41.080357 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:41.080287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:28:41.080887 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:41.080289 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:28:42.084565 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:42.084529 2572 generic.go:358] "Generic (PLEG): container finished" podID="7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2" containerID="bcc7927162e8f53c26ef4bb39da8c5daf91beffa8b54d2f82a50fb275e6858df" exitCode=0 Apr 17 14:28:42.085035 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:42.084585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2pc7" event={"ID":"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2","Type":"ContainerDied","Data":"bcc7927162e8f53c26ef4bb39da8c5daf91beffa8b54d2f82a50fb275e6858df"} Apr 17 14:28:43.088724 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:43.088682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2pc7" event={"ID":"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2","Type":"ContainerStarted","Data":"a8996dfed9f806ac769067f4ba0681d386241d1ca706890bd75fd7787a45db6f"} Apr 17 14:28:43.088724 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:43.088727 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n2pc7" event={"ID":"7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2","Type":"ContainerStarted","Data":"94ff10e96ccbec397fba8cedcac420d45d740ad1b47680769f29410e3ed243ca"} Apr 17 14:28:45.347807 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.347753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:28:45.350136 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.350114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"image-registry-d5f5d55cb-66jv6\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:28:45.449012 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.448975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:28:45.449128 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.449029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:28:45.451414 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.451391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4663e8d-b134-432d-b180-efed510f0b7e-metrics-tls\") pod \"dns-default-slgs9\" (UID: \"e4663e8d-b134-432d-b180-efed510f0b7e\") " pod="openshift-dns/dns-default-slgs9" Apr 17 14:28:45.451539 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.451449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067a27fb-e850-42e3-8f46-7d062e8e4ac4-cert\") pod \"ingress-canary-qfmj2\" (UID: \"067a27fb-e850-42e3-8f46-7d062e8e4ac4\") " pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:28:45.584763 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.584737 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p2qzh\"" Apr 17 14:28:45.584942 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:28:45.584737 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qz9tk\"" Apr 17 14:28:45.584942 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.584737 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gbrc2\"" Apr 17 14:28:45.592584 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.592565 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:28:45.592722 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.592637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfmj2" Apr 17 14:28:45.592722 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.592661 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-slgs9" Apr 17 14:28:45.725972 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.725910 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n2pc7" podStartSLOduration=5.9727467910000005 podStartE2EDuration="6.725887044s" podCreationTimestamp="2026-04-17 14:28:39 +0000 UTC" firstStartedPulling="2026-04-17 14:28:40.815965933 +0000 UTC m=+155.800873217" lastFinishedPulling="2026-04-17 14:28:41.569106186 +0000 UTC m=+156.554013470" observedRunningTime="2026-04-17 14:28:43.10490688 +0000 UTC m=+158.089814186" watchObservedRunningTime="2026-04-17 14:28:45.725887044 +0000 UTC m=+160.710794350" Apr 17 14:28:45.726461 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.726420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-slgs9"] Apr 17 14:28:45.730412 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:28:45.730386 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4663e8d_b134_432d_b180_efed510f0b7e.slice/crio-87616794a3a864b359db16ab980cbdd85ca97b6bc495da8a5f8ea1870fe404de WatchSource:0}: Error finding container 87616794a3a864b359db16ab980cbdd85ca97b6bc495da8a5f8ea1870fe404de: Status 404 returned error can't find the container with id 87616794a3a864b359db16ab980cbdd85ca97b6bc495da8a5f8ea1870fe404de Apr 17 14:28:45.951846 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.951818 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qfmj2"] Apr 17 14:28:45.954606 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:28:45.954580 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod067a27fb_e850_42e3_8f46_7d062e8e4ac4.slice/crio-f9f9752068d1679c2ed5b719768d53e943ea7ebfbe0b4dd857b2dcf039a344cc WatchSource:0}: Error finding container f9f9752068d1679c2ed5b719768d53e943ea7ebfbe0b4dd857b2dcf039a344cc: Status 404 returned error can't find the container with id f9f9752068d1679c2ed5b719768d53e943ea7ebfbe0b4dd857b2dcf039a344cc Apr 17 14:28:45.956966 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:45.956944 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5f5d55cb-66jv6"] Apr 17 14:28:45.960069 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:28:45.960036 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744f381c_27ec_4d42_9ea2_2346a9303e65.slice/crio-383d4054e8127dc9a1602fea89af026a25675fca135563d1d9fee6ab3d8c7cec WatchSource:0}: Error finding container 383d4054e8127dc9a1602fea89af026a25675fca135563d1d9fee6ab3d8c7cec: Status 404 returned error can't find the container with id 383d4054e8127dc9a1602fea89af026a25675fca135563d1d9fee6ab3d8c7cec Apr 17 14:28:46.097843 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:46.097806 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-slgs9" event={"ID":"e4663e8d-b134-432d-b180-efed510f0b7e","Type":"ContainerStarted","Data":"87616794a3a864b359db16ab980cbdd85ca97b6bc495da8a5f8ea1870fe404de"} Apr 17 14:28:46.098865 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:46.098834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qfmj2" event={"ID":"067a27fb-e850-42e3-8f46-7d062e8e4ac4","Type":"ContainerStarted","Data":"f9f9752068d1679c2ed5b719768d53e943ea7ebfbe0b4dd857b2dcf039a344cc"} Apr 17 14:28:46.100140 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:46.100118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" event={"ID":"744f381c-27ec-4d42-9ea2-2346a9303e65","Type":"ContainerStarted","Data":"70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af"} Apr 17 14:28:46.100140 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:46.100142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" event={"ID":"744f381c-27ec-4d42-9ea2-2346a9303e65","Type":"ContainerStarted","Data":"383d4054e8127dc9a1602fea89af026a25675fca135563d1d9fee6ab3d8c7cec"} Apr 17 14:28:46.100291 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:46.100268 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:28:46.122174 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:46.122121 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" podStartSLOduration=162.122102627 podStartE2EDuration="2m42.122102627s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 
14:28:46.121045557 +0000 UTC m=+161.105952896" watchObservedRunningTime="2026-04-17 14:28:46.122102627 +0000 UTC m=+161.107009933" Apr 17 14:28:47.721307 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.721270 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" podUID="1be854c3-3a12-46a0-9f6e-9e1fccba5835" containerName="addon-agent" probeResult="failure" output="Get \"http://10.133.0.6:8000/healthz\": dial tcp 10.133.0.6:8000: connect: connection refused" Apr 17 14:28:47.740114 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.740070 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" podUID="4a34165c-c742-40e7-b117-8bc0046f32d4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 14:28:47.740248 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.740146 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" Apr 17 14:28:47.740827 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.740782 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"55a3f944380f5d8c1359ac660ad148744f702556cbd18940ba390ebfe9a809ab"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 14:28:47.740931 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.740867 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" podUID="4a34165c-c742-40e7-b117-8bc0046f32d4" containerName="service-proxy" 
containerID="cri-o://55a3f944380f5d8c1359ac660ad148744f702556cbd18940ba390ebfe9a809ab" gracePeriod=30 Apr 17 14:28:47.768566 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.768523 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" podUID="39ea32e8-f57b-4997-8172-21d875d83841" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.9:8000/healthz\": dial tcp 10.133.0.9:8000: connect: connection refused" Apr 17 14:28:47.801994 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:47.801959 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" podUID="39ea32e8-f57b-4997-8172-21d875d83841" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.9:8000/readyz\": dial tcp 10.133.0.9:8000: connect: connection refused" Apr 17 14:28:48.107783 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.107697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-slgs9" event={"ID":"e4663e8d-b134-432d-b180-efed510f0b7e","Type":"ContainerStarted","Data":"3fcabd5b5f5af350d3e0dcfb905717fadf923e8f57f8683466b49f3448dfcc61"} Apr 17 14:28:48.107783 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.107737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-slgs9" event={"ID":"e4663e8d-b134-432d-b180-efed510f0b7e","Type":"ContainerStarted","Data":"651b6f2db3ed1ec17fd55417b8aba07cd92430e525ec157db79324c2612c643b"} Apr 17 14:28:48.107984 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.107832 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-slgs9" Apr 17 14:28:48.109006 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.108981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qfmj2" 
event={"ID":"067a27fb-e850-42e3-8f46-7d062e8e4ac4","Type":"ContainerStarted","Data":"0cccf28ab59d7f0e6699a84507a783d2381638f1c229ac95081dc1ccd3b65df4"} Apr 17 14:28:48.110768 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.110745 2572 generic.go:358] "Generic (PLEG): container finished" podID="39ea32e8-f57b-4997-8172-21d875d83841" containerID="ffd76bb579fd5e00de86393a7fdd29bd1fe166e804608a38d88be8fb20850114" exitCode=1 Apr 17 14:28:48.110850 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.110824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" event={"ID":"39ea32e8-f57b-4997-8172-21d875d83841","Type":"ContainerDied","Data":"ffd76bb579fd5e00de86393a7fdd29bd1fe166e804608a38d88be8fb20850114"} Apr 17 14:28:48.111126 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.111104 2572 scope.go:117] "RemoveContainer" containerID="ffd76bb579fd5e00de86393a7fdd29bd1fe166e804608a38d88be8fb20850114" Apr 17 14:28:48.112852 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.112832 2572 generic.go:358] "Generic (PLEG): container finished" podID="4a34165c-c742-40e7-b117-8bc0046f32d4" containerID="55a3f944380f5d8c1359ac660ad148744f702556cbd18940ba390ebfe9a809ab" exitCode=2 Apr 17 14:28:48.112944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.112901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" event={"ID":"4a34165c-c742-40e7-b117-8bc0046f32d4","Type":"ContainerDied","Data":"55a3f944380f5d8c1359ac660ad148744f702556cbd18940ba390ebfe9a809ab"} Apr 17 14:28:48.112944 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.112932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56b4dcf7ff-gqkkb" 
event={"ID":"4a34165c-c742-40e7-b117-8bc0046f32d4","Type":"ContainerStarted","Data":"9c63f601dff746c339d58d6affce96bd4a6828cd7f2c6a96c835de52c267bb55"} Apr 17 14:28:48.114389 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.114356 2572 generic.go:358] "Generic (PLEG): container finished" podID="1be854c3-3a12-46a0-9f6e-9e1fccba5835" containerID="94ce0ead4b704d6c8a07752c51d61c70e2710f50af42c0dd6908e79b176c46ab" exitCode=255 Apr 17 14:28:48.114511 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.114384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" event={"ID":"1be854c3-3a12-46a0-9f6e-9e1fccba5835","Type":"ContainerDied","Data":"94ce0ead4b704d6c8a07752c51d61c70e2710f50af42c0dd6908e79b176c46ab"} Apr 17 14:28:48.114708 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.114690 2572 scope.go:117] "RemoveContainer" containerID="94ce0ead4b704d6c8a07752c51d61c70e2710f50af42c0dd6908e79b176c46ab" Apr 17 14:28:48.127318 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.127272 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-slgs9" podStartSLOduration=129.267146982 podStartE2EDuration="2m11.127256062s" podCreationTimestamp="2026-04-17 14:26:37 +0000 UTC" firstStartedPulling="2026-04-17 14:28:45.732647687 +0000 UTC m=+160.717554974" lastFinishedPulling="2026-04-17 14:28:47.59275677 +0000 UTC m=+162.577664054" observedRunningTime="2026-04-17 14:28:48.125241112 +0000 UTC m=+163.110148429" watchObservedRunningTime="2026-04-17 14:28:48.127256062 +0000 UTC m=+163.112163369" Apr 17 14:28:48.143829 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:48.143784 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qfmj2" podStartSLOduration=129.504135658 podStartE2EDuration="2m11.143770782s" podCreationTimestamp="2026-04-17 14:26:37 +0000 UTC" 
firstStartedPulling="2026-04-17 14:28:45.956776905 +0000 UTC m=+160.941684189" lastFinishedPulling="2026-04-17 14:28:47.596412025 +0000 UTC m=+162.581319313" observedRunningTime="2026-04-17 14:28:48.143523713 +0000 UTC m=+163.128431020" watchObservedRunningTime="2026-04-17 14:28:48.143770782 +0000 UTC m=+163.128678087" Apr 17 14:28:49.119151 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:49.119115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" event={"ID":"39ea32e8-f57b-4997-8172-21d875d83841","Type":"ContainerStarted","Data":"42d04e967e8c02cd9b3dbf9e1bb030d6ac09bbf41a4b4787437ac8fddf344a34"} Apr 17 14:28:49.119634 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:49.119402 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:28:49.120511 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:49.120491 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f987cb679-5gdd6" Apr 17 14:28:49.120731 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:49.120712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-564656ff9f-8rvft" event={"ID":"1be854c3-3a12-46a0-9f6e-9e1fccba5835","Type":"ContainerStarted","Data":"23e71bdd78f7e0b8e76711108e6ec75d7f8e89dac4da3f915d3b2e0362d5cb37"} Apr 17 14:28:49.870588 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:49.870558 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d5f5d55cb-66jv6"] Apr 17 14:28:55.579591 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:55.579553 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:28:58.123063 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:28:58.123032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-slgs9" Apr 17 14:29:09.876726 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:09.876697 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:29:14.889613 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:14.889558 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" podUID="744f381c-27ec-4d42-9ea2-2346a9303e65" containerName="registry" containerID="cri-o://70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af" gracePeriod=30 Apr 17 14:29:15.120217 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.120192 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:29:15.185751 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185667 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/744f381c-27ec-4d42-9ea2-2346a9303e65-ca-trust-extracted\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185751 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185738 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-image-registry-private-configuration\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185983 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185768 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-certificates\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185983 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185798 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-trusted-ca\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185983 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185826 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-installation-pull-secrets\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: 
\"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185983 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185866 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185983 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185889 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2wxb\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-kube-api-access-b2wxb\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.185983 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.185923 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-bound-sa-token\") pod \"744f381c-27ec-4d42-9ea2-2346a9303e65\" (UID: \"744f381c-27ec-4d42-9ea2-2346a9303e65\") " Apr 17 14:29:15.186335 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.186270 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:29:15.186708 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.186676 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:29:15.188776 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.188744 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:29:15.188875 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.188743 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-kube-api-access-b2wxb" (OuterVolumeSpecName: "kube-api-access-b2wxb") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "kube-api-access-b2wxb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:29:15.188875 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.188844 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:29:15.188974 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.188929 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:29:15.189274 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.189252 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:29:15.189480 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.189450 2572 generic.go:358] "Generic (PLEG): container finished" podID="744f381c-27ec-4d42-9ea2-2346a9303e65" containerID="70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af" exitCode=0 Apr 17 14:29:15.189556 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.189526 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" Apr 17 14:29:15.189556 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.189537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" event={"ID":"744f381c-27ec-4d42-9ea2-2346a9303e65","Type":"ContainerDied","Data":"70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af"} Apr 17 14:29:15.190605 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.189908 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5f5d55cb-66jv6" event={"ID":"744f381c-27ec-4d42-9ea2-2346a9303e65","Type":"ContainerDied","Data":"383d4054e8127dc9a1602fea89af026a25675fca135563d1d9fee6ab3d8c7cec"} Apr 17 14:29:15.190605 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.189948 2572 scope.go:117] "RemoveContainer" containerID="70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af" Apr 17 14:29:15.199985 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.199956 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744f381c-27ec-4d42-9ea2-2346a9303e65-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "744f381c-27ec-4d42-9ea2-2346a9303e65" (UID: "744f381c-27ec-4d42-9ea2-2346a9303e65"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:29:15.203001 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.202984 2572 scope.go:117] "RemoveContainer" containerID="70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af" Apr 17 14:29:15.203255 ip-10-0-130-190 kubenswrapper[2572]: E0417 14:29:15.203234 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af\": container with ID starting with 70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af not found: ID does not exist" containerID="70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af" Apr 17 14:29:15.203315 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.203267 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af"} err="failed to get container status \"70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af\": rpc error: code = NotFound desc = could not find container \"70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af\": container with ID starting with 70e0e1bfbf2ab7e31cfd30a3464d959c30160962080cc343b66474b70bc9b1af not found: ID does not exist" Apr 17 14:29:15.286696 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286651 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/744f381c-27ec-4d42-9ea2-2346a9303e65-ca-trust-extracted\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286696 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286688 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-image-registry-private-configuration\") on node 
\"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286696 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286699 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-certificates\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286924 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286709 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/744f381c-27ec-4d42-9ea2-2346a9303e65-trusted-ca\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286924 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286719 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/744f381c-27ec-4d42-9ea2-2346a9303e65-installation-pull-secrets\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286924 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286728 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-registry-tls\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286924 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286736 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2wxb\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-kube-api-access-b2wxb\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.286924 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.286745 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/744f381c-27ec-4d42-9ea2-2346a9303e65-bound-sa-token\") on node \"ip-10-0-130-190.ec2.internal\" DevicePath \"\"" Apr 17 14:29:15.509521 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:29:15.509494 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d5f5d55cb-66jv6"] Apr 17 14:29:15.512787 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.512762 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-d5f5d55cb-66jv6"] Apr 17 14:29:15.580526 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:29:15.580497 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744f381c-27ec-4d42-9ea2-2346a9303e65" path="/var/lib/kubelet/pods/744f381c-27ec-4d42-9ea2-2346a9303e65/volumes" Apr 17 14:30:16.316255 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:16.316213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:30:16.318559 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:16.318537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e20b346-d933-444b-947f-2bb4b05a5b07-metrics-certs\") pod \"network-metrics-daemon-b4mhh\" (UID: \"1e20b346-d933-444b-947f-2bb4b05a5b07\") " pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:30:16.583110 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:16.583030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w222h\"" Apr 17 14:30:16.590717 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:16.590694 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b4mhh" Apr 17 14:30:16.724538 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:16.724506 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b4mhh"] Apr 17 14:30:16.727577 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:30:16.727550 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e20b346_d933_444b_947f_2bb4b05a5b07.slice/crio-a4d6a9e9426151d2d85c8b99e88c997413db6c8eeca10b5850444b872897e323 WatchSource:0}: Error finding container a4d6a9e9426151d2d85c8b99e88c997413db6c8eeca10b5850444b872897e323: Status 404 returned error can't find the container with id a4d6a9e9426151d2d85c8b99e88c997413db6c8eeca10b5850444b872897e323 Apr 17 14:30:17.348311 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:17.348273 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b4mhh" event={"ID":"1e20b346-d933-444b-947f-2bb4b05a5b07","Type":"ContainerStarted","Data":"a4d6a9e9426151d2d85c8b99e88c997413db6c8eeca10b5850444b872897e323"} Apr 17 14:30:18.352344 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:18.352300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b4mhh" event={"ID":"1e20b346-d933-444b-947f-2bb4b05a5b07","Type":"ContainerStarted","Data":"1df8d89eab61b6d0134cc601094780230257bfbf2e098cd8a63eec7cd52da053"} Apr 17 14:30:18.352344 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:18.352342 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b4mhh" event={"ID":"1e20b346-d933-444b-947f-2bb4b05a5b07","Type":"ContainerStarted","Data":"ab84c68cbefb53a196664995b3a0766f4ee8856d14865150cef4b4c531e5a5b0"} Apr 17 14:30:18.367092 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:30:18.367045 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-b4mhh" podStartSLOduration=252.381710366 podStartE2EDuration="4m13.36703258s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:30:16.729379402 +0000 UTC m=+251.714286686" lastFinishedPulling="2026-04-17 14:30:17.714701613 +0000 UTC m=+252.699608900" observedRunningTime="2026-04-17 14:30:18.365608754 +0000 UTC m=+253.350516086" watchObservedRunningTime="2026-04-17 14:30:18.36703258 +0000 UTC m=+253.351939886" Apr 17 14:31:05.458379 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:31:05.458352 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log" Apr 17 14:31:05.459158 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:31:05.459142 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log" Apr 17 14:31:05.464637 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:31:05.464616 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 14:33:21.922601 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.922516 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hr4cb"] Apr 17 14:33:21.923061 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.922739 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="744f381c-27ec-4d42-9ea2-2346a9303e65" containerName="registry" Apr 17 14:33:21.923061 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.922748 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="744f381c-27ec-4d42-9ea2-2346a9303e65" containerName="registry" Apr 17 14:33:21.923061 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.922790 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="744f381c-27ec-4d42-9ea2-2346a9303e65" containerName="registry" Apr 17 
14:33:21.925458 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.925425 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:21.927767 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.927740 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 14:33:21.928573 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.928559 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-rg85s\"" Apr 17 14:33:21.928629 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.928575 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 14:33:21.935004 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:21.934981 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hr4cb"] Apr 17 14:33:22.022702 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.022670 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6md4b\" (UniqueName: \"kubernetes.io/projected/1eff5215-b922-4bbb-9961-aa74232fc792-kube-api-access-6md4b\") pod \"cert-manager-cainjector-8966b78d4-hr4cb\" (UID: \"1eff5215-b922-4bbb-9961-aa74232fc792\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.022879 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.022724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1eff5215-b922-4bbb-9961-aa74232fc792-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hr4cb\" (UID: \"1eff5215-b922-4bbb-9961-aa74232fc792\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.123661 ip-10-0-130-190 
kubenswrapper[2572]: I0417 14:33:22.123636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6md4b\" (UniqueName: \"kubernetes.io/projected/1eff5215-b922-4bbb-9961-aa74232fc792-kube-api-access-6md4b\") pod \"cert-manager-cainjector-8966b78d4-hr4cb\" (UID: \"1eff5215-b922-4bbb-9961-aa74232fc792\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.123795 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.123683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1eff5215-b922-4bbb-9961-aa74232fc792-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hr4cb\" (UID: \"1eff5215-b922-4bbb-9961-aa74232fc792\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.131486 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.131456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6md4b\" (UniqueName: \"kubernetes.io/projected/1eff5215-b922-4bbb-9961-aa74232fc792-kube-api-access-6md4b\") pod \"cert-manager-cainjector-8966b78d4-hr4cb\" (UID: \"1eff5215-b922-4bbb-9961-aa74232fc792\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.131591 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.131498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1eff5215-b922-4bbb-9961-aa74232fc792-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hr4cb\" (UID: \"1eff5215-b922-4bbb-9961-aa74232fc792\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.234387 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.234290 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" Apr 17 14:33:22.349427 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.349392 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hr4cb"] Apr 17 14:33:22.352344 ip-10-0-130-190 kubenswrapper[2572]: W0417 14:33:22.352315 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eff5215_b922_4bbb_9961_aa74232fc792.slice/crio-4c9f29063ccf872e0b4eaa8a03807a3a0a631c0b295a5e7ec55a0577508109c4 WatchSource:0}: Error finding container 4c9f29063ccf872e0b4eaa8a03807a3a0a631c0b295a5e7ec55a0577508109c4: Status 404 returned error can't find the container with id 4c9f29063ccf872e0b4eaa8a03807a3a0a631c0b295a5e7ec55a0577508109c4 Apr 17 14:33:22.354040 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.354023 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:33:22.812220 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:22.812187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" event={"ID":"1eff5215-b922-4bbb-9961-aa74232fc792","Type":"ContainerStarted","Data":"4c9f29063ccf872e0b4eaa8a03807a3a0a631c0b295a5e7ec55a0577508109c4"} Apr 17 14:33:25.822057 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:25.821969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" event={"ID":"1eff5215-b922-4bbb-9961-aa74232fc792","Type":"ContainerStarted","Data":"8d5cfc180a2a699b8d1d09256ee09bdcba3e1387a21251d55760141983077f88"} Apr 17 14:33:25.836006 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:33:25.835948 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-hr4cb" podStartSLOduration=1.675785523 podStartE2EDuration="4.835927165s" 
podCreationTimestamp="2026-04-17 14:33:21 +0000 UTC" firstStartedPulling="2026-04-17 14:33:22.354153759 +0000 UTC m=+437.339061042" lastFinishedPulling="2026-04-17 14:33:25.514295386 +0000 UTC m=+440.499202684" observedRunningTime="2026-04-17 14:33:25.835580247 +0000 UTC m=+440.820487578" watchObservedRunningTime="2026-04-17 14:33:25.835927165 +0000 UTC m=+440.820834472" Apr 17 14:34:00.052945 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.052910 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"] Apr 17 14:34:00.055895 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.055878 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn" Apr 17 14:34:00.059396 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.059376 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 17 14:34:00.059535 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.059426 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 17 14:34:00.060114 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.060096 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 14:34:00.060212 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.060184 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:34:00.060273 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.060200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 17 14:34:00.060386 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.060372 2572 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-2762q\""
Apr 17 14:34:00.066231 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.066210 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"]
Apr 17 14:34:00.201219 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.201167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-manager-config\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.201219 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.201223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-cert\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.201503 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.201288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46mmx\" (UniqueName: \"kubernetes.io/projected/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-kube-api-access-46mmx\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.201503 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.201326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-metrics-certs\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.302509 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.302467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-manager-config\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.302509 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.302513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-cert\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.302770 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.302547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46mmx\" (UniqueName: \"kubernetes.io/projected/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-kube-api-access-46mmx\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.302770 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.302570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-metrics-certs\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.303315 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.303253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-manager-config\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.305446 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.305412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-cert\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.305624 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.305606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-metrics-certs\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.313325 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.313302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46mmx\" (UniqueName: \"kubernetes.io/projected/2dfc3162-592b-4ad3-b570-1c2b14bc2a05-kube-api-access-46mmx\") pod \"jobset-controller-manager-5bdfcf4dbd-wmfnn\" (UID: \"2dfc3162-592b-4ad3-b570-1c2b14bc2a05\") " pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.364399 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.364351 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:00.485369 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.485337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"]
Apr 17 14:34:00.910593 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:00.910549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn" event={"ID":"2dfc3162-592b-4ad3-b570-1c2b14bc2a05","Type":"ContainerStarted","Data":"3f1a87cdc4665ba99dd8b2632cc8f4578fbf88665cae8ceab9368a82cbaff96b"}
Apr 17 14:34:02.918331 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:02.918292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn" event={"ID":"2dfc3162-592b-4ad3-b570-1c2b14bc2a05","Type":"ContainerStarted","Data":"d13d1172d2e9a1aa0c89eb5a65383ad58224e913c07fdc080213d00e19136db8"}
Apr 17 14:34:02.918735 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:02.918392 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:34:02.937618 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:02.937561 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn" podStartSLOduration=1.318082787 podStartE2EDuration="2.937544133s" podCreationTimestamp="2026-04-17 14:34:00 +0000 UTC" firstStartedPulling="2026-04-17 14:34:00.493584654 +0000 UTC m=+475.478491938" lastFinishedPulling="2026-04-17 14:34:02.113045999 +0000 UTC m=+477.097953284" observedRunningTime="2026-04-17 14:34:02.93560784 +0000 UTC m=+477.920515150" watchObservedRunningTime="2026-04-17 14:34:02.937544133 +0000 UTC m=+477.922451438"
Apr 17 14:34:13.926076 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:34:13.926044 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-5bdfcf4dbd-wmfnn"
Apr 17 14:36:05.475602 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:36:05.475572 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:36:05.476128 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:36:05.475824 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:41:05.492191 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:41:05.492121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:41:05.492723 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:41:05.492327 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:46:05.507896 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:46:05.507868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:46:05.510270 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:46:05.509458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:51:05.528098 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:51:05.527984 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:51:05.533186 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:51:05.532645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:56:05.546400 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:56:05.546285 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 14:56:05.552420 ip-10-0-130-190 kubenswrapper[2572]: I0417 14:56:05.549527 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 15:01:05.566500 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:01:05.566376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 15:01:05.570415 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:01:05.569157 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 15:06:05.584089 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:05.583983 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 15:06:05.587909 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:05.586786 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 15:06:14.968072 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:14.968038 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-l4b6r_de388feb-3144-4718-889c-452bc20215d4/global-pull-secret-syncer/0.log"
Apr 17 15:06:15.073279 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:15.073247 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-d6hck_29d21c65-516d-41f7-8313-d3fd5a97d74a/konnectivity-agent/0.log"
Apr 17 15:06:15.146022 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:15.145991 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-190.ec2.internal_fd5f4115a37b7b2be21e64925c15d47d/haproxy/0.log"
Apr 17 15:06:18.893464 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:18.893419 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n2pc7_7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2/node-exporter/0.log"
Apr 17 15:06:18.914164 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:18.914139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n2pc7_7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2/kube-rbac-proxy/0.log"
Apr 17 15:06:18.936600 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:18.936579 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n2pc7_7ef53ff4-b680-4f2f-92c4-4028ac7e4ab2/init-textfile/0.log"
Apr 17 15:06:21.599485 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.599393 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"]
Apr 17 15:06:21.602346 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.602323 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.604687 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.604653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dfmtr\"/\"kube-root-ca.crt\""
Apr 17 15:06:21.604793 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.604706 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dfmtr\"/\"openshift-service-ca.crt\""
Apr 17 15:06:21.605285 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.605271 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dfmtr\"/\"default-dockercfg-4dlrr\""
Apr 17 15:06:21.612322 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.612298 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"]
Apr 17 15:06:21.771712 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.771675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-sys\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.771712 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.771715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-lib-modules\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.771964 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.771739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-podres\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.771964 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.771823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5cnt\" (UniqueName: \"kubernetes.io/projected/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-kube-api-access-l5cnt\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.771964 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.771908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-proc\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873285 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-sys\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873285 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-lib-modules\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873285 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-podres\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873575 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5cnt\" (UniqueName: \"kubernetes.io/projected/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-kube-api-access-l5cnt\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873575 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-proc\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873575 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-sys\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873575 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873396 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-podres\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873575 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-proc\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.873575 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.873402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-lib-modules\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.881331 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.881302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5cnt\" (UniqueName: \"kubernetes.io/projected/6f5b9c3d-cb2f-49f6-bac4-c5f06845239f-kube-api-access-l5cnt\") pod \"perf-node-gather-daemonset-kkwmb\" (UID: \"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:21.912485 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:21.912453 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:22.025400 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.025358 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"]
Apr 17 15:06:22.029721 ip-10-0-130-190 kubenswrapper[2572]: W0417 15:06:22.029693 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6f5b9c3d_cb2f_49f6_bac4_c5f06845239f.slice/crio-39d671ee81d5a876d1e77d665e0e548a18377f7d7da2b61413e760f071e081e1 WatchSource:0}: Error finding container 39d671ee81d5a876d1e77d665e0e548a18377f7d7da2b61413e760f071e081e1: Status 404 returned error can't find the container with id 39d671ee81d5a876d1e77d665e0e548a18377f7d7da2b61413e760f071e081e1
Apr 17 15:06:22.031274 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.031256 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 15:06:22.293976 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.293945 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-slgs9_e4663e8d-b134-432d-b180-efed510f0b7e/dns/0.log"
Apr 17 15:06:22.310338 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.310312 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-slgs9_e4663e8d-b134-432d-b180-efed510f0b7e/kube-rbac-proxy/0.log"
Apr 17 15:06:22.365897 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.365871 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l7r6t_e7e5fd15-2e9b-40a4-90da-1410a8f629bd/dns-node-resolver/0.log"
Apr 17 15:06:22.795587 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.795559 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-k494v_de3261e7-2587-464e-ac8f-c31c1d9d88e8/node-ca/0.log"
Apr 17 15:06:22.865391 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.865355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb" event={"ID":"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f","Type":"ContainerStarted","Data":"c92510079c497230d5ee69fe12f4120be67bec480efdb411213cff7c9cc11da7"}
Apr 17 15:06:22.865391 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.865390 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb" event={"ID":"6f5b9c3d-cb2f-49f6-bac4-c5f06845239f","Type":"ContainerStarted","Data":"39d671ee81d5a876d1e77d665e0e548a18377f7d7da2b61413e760f071e081e1"}
Apr 17 15:06:22.865604 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:22.865477 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:23.752708 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:23.752678 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qfmj2_067a27fb-e850-42e3-8f46-7d062e8e4ac4/serve-healthcheck-canary/0.log"
Apr 17 15:06:24.095566 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:24.095485 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7swq7_6e687083-546e-415b-a585-899a3e577344/kube-rbac-proxy/0.log"
Apr 17 15:06:24.112932 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:24.112900 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7swq7_6e687083-546e-415b-a585-899a3e577344/exporter/0.log"
Apr 17 15:06:24.130246 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:24.130222 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7swq7_6e687083-546e-415b-a585-899a3e577344/extractor/0.log"
Apr 17 15:06:25.721184 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:25.721150 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-5bdfcf4dbd-wmfnn_2dfc3162-592b-4ad3-b570-1c2b14bc2a05/manager/0.log"
Apr 17 15:06:28.878470 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:28.878419 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb"
Apr 17 15:06:28.892318 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:28.892244 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-kkwmb" podStartSLOduration=7.892227326 podStartE2EDuration="7.892227326s" podCreationTimestamp="2026-04-17 15:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:06:22.878972054 +0000 UTC m=+2417.863879361" watchObservedRunningTime="2026-04-17 15:06:28.892227326 +0000 UTC m=+2423.877134647"
Apr 17 15:06:29.605647 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.605615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/kube-multus-additional-cni-plugins/0.log"
Apr 17 15:06:29.623217 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.623193 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/egress-router-binary-copy/0.log"
Apr 17 15:06:29.640202 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.640182 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/cni-plugins/0.log"
Apr 17 15:06:29.658085 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.658063 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/bond-cni-plugin/0.log"
Apr 17 15:06:29.675153 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.675133 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/routeoverride-cni/0.log"
Apr 17 15:06:29.691716 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.691698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/whereabouts-cni-bincopy/0.log"
Apr 17 15:06:29.710506 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:29.710483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65fjv_f1097dd8-2309-4ac7-ae1d-b1ca093e2063/whereabouts-cni/0.log"
Apr 17 15:06:30.083485 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:30.083459 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-brlgp_70b0c40d-084b-491c-8390-f199b025b91b/kube-multus/0.log"
Apr 17 15:06:30.131578 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:30.131549 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-b4mhh_1e20b346-d933-444b-947f-2bb4b05a5b07/network-metrics-daemon/0.log"
Apr 17 15:06:30.149801 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:30.149739 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-b4mhh_1e20b346-d933-444b-947f-2bb4b05a5b07/kube-rbac-proxy/0.log"
Apr 17 15:06:31.303303 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.303235 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-controller/0.log"
Apr 17 15:06:31.319471 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.319448 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/0.log"
Apr 17 15:06:31.329123 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.329094 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovn-acl-logging/1.log"
Apr 17 15:06:31.345542 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.345509 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/kube-rbac-proxy-node/0.log"
Apr 17 15:06:31.361544 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.361518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 15:06:31.376573 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.376553 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/northd/0.log"
Apr 17 15:06:31.392880 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.392858 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/nbdb/0.log"
Apr 17 15:06:31.409326 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.409291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/sbdb/0.log"
Apr 17 15:06:31.489153 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:31.489124 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nck2f_c607d8f0-4652-40d6-a3b8-74f2c8fcc998/ovnkube-controller/0.log"
Apr 17 15:06:32.731635 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:32.731602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-d4f88_479a0d66-ba09-406a-9da8-b98589e81608/network-check-target-container/0.log"
Apr 17 15:06:33.612062 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:33.612027 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lcdd5_12f3294c-ef76-4702-8d03-5991c66cadb2/iptables-alerter/0.log"
Apr 17 15:06:34.182201 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:34.182171 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wmdfs_a4d3e998-5f09-49e6-aecd-b23ad2e3ba0d/tuned/0.log"
Apr 17 15:06:37.245957 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:37.245918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-8zfj2_fc19c040-93e2-4007-93c1-ee24954d0d5a/csi-driver/0.log"
Apr 17 15:06:37.263214 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:37.263185 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-8zfj2_fc19c040-93e2-4007-93c1-ee24954d0d5a/csi-node-driver-registrar/0.log"
Apr 17 15:06:37.281156 ip-10-0-130-190 kubenswrapper[2572]: I0417 15:06:37.281130 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-8zfj2_fc19c040-93e2-4007-93c1-ee24954d0d5a/csi-liveness-probe/0.log"