Apr 22 19:23:04.977291 ip-10-0-141-16 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:05.374399 ip-10-0-141-16 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:05.374399 ip-10-0-141-16 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:05.374399 ip-10-0-141-16 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:05.374399 ip-10-0-141-16 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:05.374399 ip-10-0-141-16 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:05.375547 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.375127 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:05.377955 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377938 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.377955 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377953 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.377955 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377957 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377961 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377964 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377968 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377971 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377973 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377976 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377979 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377982 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377985 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377989 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377991 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377995 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.377997 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378014 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378017 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378019 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378022 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378025 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378028 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.378059 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378031 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378033 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378036 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378039 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378041 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378044 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378047 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378049 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378051 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378054 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378056 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378059 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378061 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378064 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378068 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378071 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378073 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378077 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378080 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378082 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.378537 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378085 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378088 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378090 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378093 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378095 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378098 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378100 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378102 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378106 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378111 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378113 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378116 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378118 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378121 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378125 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378129 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378133 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378138 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378141 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.379063 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378144 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378147 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378150 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378153 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378155 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378159 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378161 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378164 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378167 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378169 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378172 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378174 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378177 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378179 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378183 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378185 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378188 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378191 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378193 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378196 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:05.379527 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378199 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378201 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378204 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378207 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378209 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378597 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378602 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378606 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378609 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378612 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378614 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378617 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378620 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378623 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378625 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378628 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378631 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378634 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378637 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:05.380066 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378639 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378642 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378644 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378647 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378650 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378653 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378658 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378661 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378663 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378666 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378668 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378671 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378673 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378675 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378678 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378681 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378684 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378686 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378689 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378691 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.380532 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378695 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378697 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378700 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378703 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378705 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378708 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378710 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378713 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378715 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378718 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378720 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378723 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378725 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378728 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378730 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378733 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378735 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378738 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378741 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378743 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.381054 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378745 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378748 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378750 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378753 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378755 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378757 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378760 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378762 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378765 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378767 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378770 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378774 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378777 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378780 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378782 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378785 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378802 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378806 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378810 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378813 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.381557 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378815 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378818 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378820 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378823 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378827 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378829 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378832 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378835 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378837 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378840 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378842 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.378846 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380098 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380107 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380115 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380119 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380124 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380128 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380132 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380137 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:05.382064 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380140 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380143 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380146 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380151 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380154 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380157 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380160 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380163 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380166 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380169 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380172 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380176 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380179 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380182 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380185 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380188 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380192 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380196 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380200 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380203 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380206 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380209 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380212 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380215 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380218 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:05.382559 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380222 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380225 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380228 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380231 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380234 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380237 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380242 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380245 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380248 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380251 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380254 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380262 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380265 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380268 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380272 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380275 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380277 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380280 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380283 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380286 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380289 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380292 2578 flags.go:64]
FLAG: --feature-gates="" Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380296 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380299 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380302 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:05.383191 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380306 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380309 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380312 2578 flags.go:64] FLAG: --help="false" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380315 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-141-16.ec2.internal" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380318 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380321 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380324 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380327 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380330 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380333 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 
19:23:05.380336 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380339 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380342 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380344 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380347 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380350 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380353 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380356 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380361 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380364 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380367 2578 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380369 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380372 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380375 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:05.383800 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380381 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:05.384433 
ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380384 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380387 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380390 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380393 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380396 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380399 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380402 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380406 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380410 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380414 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380417 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380420 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380423 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380426 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380429 2578 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380431 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380434 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380442 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380446 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380448 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380451 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380454 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:05.384433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380460 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380463 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380466 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380469 2578 flags.go:64] FLAG: --port="10250" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380474 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380477 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c52827f819792a5b" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 
19:23:05.380480 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380483 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380486 2578 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380489 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380492 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380501 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380504 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380506 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380509 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380513 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380516 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380519 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380522 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380525 2578 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380528 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380531 2578 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380535 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380538 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380541 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380544 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:05.385042 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380547 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380552 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380555 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380558 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380560 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380563 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380566 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380569 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380573 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380578 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 
19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380582 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380585 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380589 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380591 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380594 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380597 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380600 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380603 2578 flags.go:64] FLAG: --v="2" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380608 2578 flags.go:64] FLAG: --version="false" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380612 2578 flags.go:64] FLAG: --vmodule="" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380617 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.380620 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380709 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380714 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:05.385761 ip-10-0-141-16 kubenswrapper[2578]: W0422 
19:23:05.380717 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380720 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380724 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380726 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380729 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380732 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380735 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380738 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380740 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380745 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380747 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380750 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380753 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:05.386356 ip-10-0-141-16 
kubenswrapper[2578]: W0422 19:23:05.380756 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380758 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380761 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380763 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380766 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380769 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380772 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:05.386356 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380775 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380777 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380780 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380782 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380785 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380787 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:05.386899 
ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380790 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380793 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380795 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380798 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380800 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380803 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380805 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380807 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380810 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380813 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380815 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380818 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380821 2578 feature_gate.go:328] unrecognized feature gate: 
NewOLM Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380823 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:05.386899 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380826 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380830 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380832 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380835 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380837 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380841 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380845 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380847 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380850 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380852 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380855 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380857 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380860 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380863 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380865 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380868 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380870 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380873 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: 
W0422 19:23:05.380875 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:05.387414 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380878 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380880 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380882 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380885 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380888 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380890 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380892 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380895 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380897 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380900 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380903 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380905 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:05.387880 ip-10-0-141-16 
kubenswrapper[2578]: W0422 19:23:05.380908 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380912 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380916 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380919 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380922 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380924 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380927 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.387880 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380929 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380932 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380934 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380937 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380940 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.380942 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.381608 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.387714 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.387730 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387776 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387781 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387784 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387786 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387789 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387792 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387795 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.388431 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387798 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387801 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387804 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387806 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387809 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387812 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387816 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387819 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387822 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387824 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387827 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387830 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387833 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387835 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387838 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387841 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387844 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387846 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387850 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387854 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.388869 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387858 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387861 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387864 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387867 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387870 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387874 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387878 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387880 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387883 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387886 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387889 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387892 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387894 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387897 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387899 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387902 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387905 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387907 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387910 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.389374 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387913 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387915 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387918 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387921 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387923 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387927 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387929 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387932 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387935 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387937 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387940 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387943 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387946 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387948 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387951 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387953 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387956 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387959 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387961 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387964 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:05.389847 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387966 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387969 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387971 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387974 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387977 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387979 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387981 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387984 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387986 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387989 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387991 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387994 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.387997 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388018 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388023 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388027 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388031 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388035 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388038 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:05.390367 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388041 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.388047 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388146 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388151 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388154 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388157 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388160 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388163 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388166 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388169 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388172 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388175 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388178 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388181 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388184 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:05.390831 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388187 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388190 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388193 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388195 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388198 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388200 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388203 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388206 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388208 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388211 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388213 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388216 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388219 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388221 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388225 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388230 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388233 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388235 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388238 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:05.391256 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388241 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388243 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388246 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388248 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388251 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388254 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388256 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388259 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388262 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388266 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388269 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388271 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388274 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388277 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388279 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388282 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388285 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388287 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388290 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388292 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:05.391729 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388294 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388297 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388299 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388302 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388304 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388307 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388309 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388312 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388315 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388317 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388320 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388322 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388325 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388327 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388329 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388332 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388335 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388337 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388340 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388342 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:05.392280 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388345 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388347 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388349 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388352 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388354 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388357 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388359 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388362 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388365 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388368 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388370 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388372 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388375 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:05.388377 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.388382 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:05.392771 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.389047 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:23:05.393156 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.390797 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:23:05.393156 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.391822 2578 server.go:1019] "Starting client certificate rotation"
Apr 22 19:23:05.393156 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.391923 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:05.393156 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.391958 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:05.414347 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.414327 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:05.419626 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.419600 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:05.434909 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.434890 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:23:05.440051 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.440032 2578 log.go:25] "Validated CRI v1 image API"
Apr 22 19:23:05.440753 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.440736 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:05.441557 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.441545 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:05.445774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.445748 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c25d7769-1bdb-4a96-9250-699438b1ef8d:/dev/nvme0n1p3 e43cad96-18b9-4cd1-9bc8-414d8293c0bb:/dev/nvme0n1p4]
Apr 22 19:23:05.445851 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.445773 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:05.451972 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.451865 2578 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:05.450172327 +0000 UTC m=+0.366226599 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099871 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec246c99957a64de50d88ef43051a3e6 SystemUUID:ec246c99-957a-64de-50d8-8ef43051a3e6 BootID:23d2cae2-578e-4cfb-a6d7-4128b1a37611 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e5:56:d4:e6:2b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e5:56:d4:e6:2b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e6:60:21:73:02:23 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:05.451972 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.451968 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:05.452088 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.452066 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:05.453097 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.453060 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:05.453241 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.453099 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:05.453288 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.453251 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:05.453288 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.453260 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:05.453288 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.453277 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:05.453991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.453980 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:05.454668 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.454659 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:05.454949 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.454939 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 19:23:05.456900 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.456889 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 19:23:05.456938 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.456909 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 19:23:05.456938 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.456920 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 19:23:05.456938 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.456930 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 22 19:23:05.457066 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.456939 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 19:23:05.457983 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.457970 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:05.458046 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.457989 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:05.460726 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.460703 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 19:23:05.461858 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.461845 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 19:23:05.464286 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464270 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464303 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464310 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464317 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464328 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464334 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464341 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464348 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464355 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 19:23:05.464362 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464362 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 19:23:05.464610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464372 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 19:23:05.464610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464381 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 19:23:05.464610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464406 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 19:23:05.464610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.464412 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 19:23:05.468273 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.468257 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 19:23:05.468353 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.468308 2578 server.go:1295] "Started kubelet"
Apr 22 19:23:05.468415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.468390 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 19:23:05.468520 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.468471 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 19:23:05.468575 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.468542 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 19:23:05.469200 ip-10-0-141-16 systemd[1]: Started Kubernetes Kubelet.
Apr 22 19:23:05.470195 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.470177 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:23:05.471962 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.471942 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:23:05.474277 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.474237 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-16.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 19:23:05.474370 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.474285 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 19:23:05.474370 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.474327 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 19:23:05.475014 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.474223 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-16.ec2.internal.18a8c4352146f945 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-16.ec2.internal,UID:ip-10-0-141-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-16.ec2.internal,},FirstTimestamp:2026-04-22 19:23:05.468270917 +0000 UTC m=+0.384325185,LastTimestamp:2026-04-22 19:23:05.468270917 +0000 UTC m=+0.384325185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-16.ec2.internal,}"
Apr 22 19:23:05.475638 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.475618 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:23:05.475638 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.475629 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:05.476723 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476289 2578 factory.go:55] Registering systemd factory
Apr 22 19:23:05.476723 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.476337 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.476723 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476351 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476732 2578 factory.go:153] Registering CRI-O factory
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476748 2578 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476797 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476822 2578 factory.go:103] Registering Raw factory
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476838 2578 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476915 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:23:05.476991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.476944 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:23:05.477324 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.477148 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:23:05.477324 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.477159 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:23:05.477405 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.477322 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:23:05.477405 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.477357 2578 manager.go:319] Starting recovery of all containers
Apr 22 19:23:05.477501 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.477407 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:23:05.482393 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.482210 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 19:23:05.482501 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.482477 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 19:23:05.482615 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.482593 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8j5c"
Apr 22 19:23:05.487641 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.487619 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8j5c"
Apr 22 19:23:05.488861 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.488842 2578 manager.go:324] Recovery completed
Apr 22 19:23:05.490581 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.490561 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 19:23:05.493560 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.493548 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:05.495807 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.495794 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:05.495854 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.495822 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:05.495854 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.495831 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:05.496330 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.496317 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 19:23:05.496330 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.496330 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 19:23:05.496450 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.496348 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:05.498274 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.498215 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-16.ec2.internal.18a8c43522eb2a33 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-16.ec2.internal,UID:ip-10-0-141-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-16.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-16.ec2.internal,},FirstTimestamp:2026-04-22 19:23:05.495808563 +0000 UTC m=+0.411862831,LastTimestamp:2026-04-22 19:23:05.495808563 +0000 UTC m=+0.411862831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-16.ec2.internal,}"
Apr 22 19:23:05.499224 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.499211 2578 policy_none.go:49] "None policy: Start"
Apr 22 19:23:05.499292 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.499230 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 19:23:05.499292 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.499244 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 19:23:05.531195 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531180 2578 manager.go:341] "Starting Device Plugin manager"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.531221 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531234 2578 server.go:85] "Starting device plugin registration server"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531487 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531500 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531597 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531692 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.531701 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.532167 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 19:23:05.559212 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.532209 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.573289 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.573256 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 19:23:05.574546 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.574525 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 19:23:05.574546 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.574548 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 19:23:05.574675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.574574 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 19:23:05.574675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.574581 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 19:23:05.574675 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.574611 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 19:23:05.576568 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.576552 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:05.632248 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.632176 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:05.633352 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.633337 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:05.633428 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.633369 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:05.633428 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.633379 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:05.633428 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.633405 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.641824 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.641812 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.641865 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.641831 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-16.ec2.internal\": node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.657465 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.657441 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.674967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.674935 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"]
Apr 22 19:23:05.675049 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.675039 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:05.675864 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.675849 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:05.675930 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.675877 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:05.675930 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.675891 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:05.678284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678271 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:05.678415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678400 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.678490 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678435 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:05.678956 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678939 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:05.679044 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678971 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:05.679044 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678982 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:05.679044 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.678940 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:05.679044 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.679038 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:05.679181 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.679051 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:05.681211 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.681194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.681293 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.681226 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:05.681853 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.681839 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:05.681919 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.681861 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:05.681919 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.681876 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:05.693547 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.693526 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.696855 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.696841 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.758360 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.758330 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.778535 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.778505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.778535 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.778535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.778687 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.778553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.858665 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.858629 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.879012 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.878975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.879082 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.879017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.879082 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.879035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.879082 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.879073 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.879174 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.879080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.879174 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.879073 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.959293 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:05.959219 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:05.994698 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.994665 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:05.999037 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:05.999019 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 22 19:23:06.059317 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:06.059281 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:06.159811 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:06.159780 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:06.260485 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:06.260404 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:06.361016 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:06.360969 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:06.391555 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.391533 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:06.392197 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.391681 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:06.461077 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:06.461050 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:06.470593 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.470565 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:06.475861 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.475844 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:06.484683 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.484664 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:06.491398 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.491370 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:05 +0000 UTC" deadline="2027-11-27 09:33:44.501500755 +0000 UTC"
Apr 22 19:23:06.491398 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.491396 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14006h10m38.010107213s"
Apr 22 19:23:06.510440 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.510421 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bqwmj"
Apr 22 19:23:06.518864 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.518822 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bqwmj"
Apr 22 19:23:06.562201 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:06.562173 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 22 19:23:06.567039 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.567019 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:06.602104 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.602077 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:06.668981 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:06.668951 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod237efac7542ae805317afa8331e5e27b.slice/crio-bc7278778b1cb77cb684f829ed2356ef4006b2fa609e32d41dff473fecca094a WatchSource:0}: Error finding container bc7278778b1cb77cb684f829ed2356ef4006b2fa609e32d41dff473fecca094a: Status 404 returned error can't find the container with id bc7278778b1cb77cb684f829ed2356ef4006b2fa609e32d41dff473fecca094a
Apr 22 19:23:06.669437 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:06.669417 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d97c8c240a436d06b1c4f45cd224be.slice/crio-6d0d671dadff1268b42e24b7f9800b1967b54983a6ba2eee24e8925af4a255c9 WatchSource:0}: Error finding container 6d0d671dadff1268b42e24b7f9800b1967b54983a6ba2eee24e8925af4a255c9: Status 404 returned error can't find the container with id 6d0d671dadff1268b42e24b7f9800b1967b54983a6ba2eee24e8925af4a255c9
Apr 22 19:23:06.674039 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.674026 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:23:06.676663 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.676648 2578 kubelet.go:3340] "Creating a mirror pod for
static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" Apr 22 19:23:06.689091 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.689070 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:06.689970 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.689956 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" Apr 22 19:23:06.698624 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:06.698508 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:07.457306 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.457274 2578 apiserver.go:52] "Watching apiserver" Apr 22 19:23:07.464872 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.464839 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:23:07.466086 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.466057 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rgfwb","kube-system/konnectivity-agent-jth69","kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q","openshift-cluster-node-tuning-operator/tuned-wr9b8","openshift-dns/node-resolver-rkjln","openshift-multus/multus-66jvk","openshift-multus/network-metrics-daemon-czpht","openshift-image-registry/node-ca-47vkz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal","openshift-multus/multus-additional-cni-plugins-grm6t","openshift-network-diagnostics/network-check-target-spzjt","openshift-network-operator/iptables-alerter-mncb2","openshift-ovn-kubernetes/ovnkube-node-cz572"] Apr 22 
19:23:07.469205 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.469184 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.471294 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.471269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.471797 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.471769 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:23:07.471890 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.471781 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:23:07.471890 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.471871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.472034 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.472017 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wz2bv\"" Apr 22 19:23:07.472104 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.472076 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.473332 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.473270 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.473501 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.473483 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.473757 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.473735 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.474166 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.474034 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:07.474505 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.474411 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gk94m\"" Apr 22 19:23:07.475680 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.475658 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4bwq8\"" Apr 22 19:23:07.476284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.476065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:23:07.476284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.476092 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.476284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.476137 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.476284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.476267 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.476751 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.476630 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:23:07.476751 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.476704 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:23:07.477222 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.477202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:23:07.478661 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.478485 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.478752 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.478724 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:23:07.478818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.478802 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8hszl\"" Apr 22 19:23:07.478867 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.478850 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.481335 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.481040 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.483663 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.483173 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.483663 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.483197 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.483663 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.483424 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2hncx\"" Apr 22 19:23:07.483663 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.483532 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rkjln" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.486224 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.486357 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.486627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-47vkz" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cdc22288-9935-403f-8e99-11cb3daf1c99-iptables-alerter-script\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-netns\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-run-netns\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-log-socket\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-sys-fs\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6lbx\" (UniqueName: \"kubernetes.io/projected/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-kube-api-access-d6lbx\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-kubernetes\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.487977 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.487944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysctl-conf\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.488607 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:23:07.487988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-systemd-units\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-var-lib-kubelet\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-cni-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-cni-bin\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-cni-netd\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-run\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488244 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63bd826c-675d-4901-ac56-91d345994e80-etc-tuned\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-system-cni-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f5fd4978-3887-4451-945f-2523ac01e21d-cni-binary-copy\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8m9d\" (UniqueName: \"kubernetes.io/projected/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-kube-api-access-m8m9d\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-multus-certs\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvrs8\" (UniqueName: \"kubernetes.io/projected/cdc22288-9935-403f-8e99-11cb3daf1c99-kube-api-access-pvrs8\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.488607 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.488607 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:23:07.488612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-registration-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-device-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-etc-kubernetes\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovnkube-config\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488803 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-sys\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-lib-modules\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9rc5\" (UniqueName: \"kubernetes.io/projected/63bd826c-675d-4901-ac56-91d345994e80-kube-api-access-f9rc5\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-etc-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.488968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-env-overrides\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-systemd\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cdc22288-9935-403f-8e99-11cb3daf1c99-host-slash\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-os-release\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-hostroot\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489314 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-ovn\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.489429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovnkube-script-lib\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-host\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f5fd4978-3887-4451-945f-2523ac01e21d-multus-daemon-config\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 
19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-node-log\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysconfig\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-kubelet\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489798 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysctl-d\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489847 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63bd826c-675d-4901-ac56-91d345994e80-tmp\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-k8s-cni-cncf-io\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-conf-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-slash\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-cni-multus\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.489959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-socket-dir\") 
pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-modprobe-d\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.490219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-cni-bin\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-kubelet\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5ww\" (UniqueName: \"kubernetes.io/projected/f5fd4978-3887-4451-945f-2523ac01e21d-kube-api-access-sr5ww\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-systemd\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-var-lib-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovn-node-metrics-cert\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-cnibin\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-socket-dir-parent\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490502 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xfjtr\"" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490541 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490801 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:07.490967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.490365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.491622 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.491496 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:07.491622 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.491516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nzrtl\"" Apr 22 19:23:07.491622 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.491521 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:07.493412 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.492996 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:07.495246 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.495199 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:23:07.495403 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.495386 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:23:07.495476 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.495413 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qn599\"" Apr 22 19:23:07.497986 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.497967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:07.498094 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.498060 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:07.500628 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.500227 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:07.500628 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.500287 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:07.500628 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.500317 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.502745 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.502647 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:07.502745 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.502652 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:07.502891 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.502755 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-78k8d\"" Apr 22 19:23:07.520487 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.520460 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:06 +0000 UTC" deadline="2028-01-05 21:34:28.691533554 +0000 UTC" Apr 22 19:23:07.520487 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.520487 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14954h11m21.17104911s" Apr 22 19:23:07.578963 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.578941 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:23:07.579922 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.579879 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" 
event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerStarted","Data":"6d0d671dadff1268b42e24b7f9800b1967b54983a6ba2eee24e8925af4a255c9"} Apr 22 19:23:07.581558 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.581533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" event={"ID":"237efac7542ae805317afa8331e5e27b","Type":"ContainerStarted","Data":"bc7278778b1cb77cb684f829ed2356ef4006b2fa609e32d41dff473fecca094a"} Apr 22 19:23:07.590764 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f5fd4978-3887-4451-945f-2523ac01e21d-multus-daemon-config\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.590887 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-cnibin\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.590887 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-node-log\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.590887 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysconfig\") pod 
\"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.591073 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blrbd\" (UniqueName: \"kubernetes.io/projected/ef0d702b-9f81-4046-b801-085bdfdf12b5-kube-api-access-blrbd\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.591073 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-node-log\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.591073 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.590989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ef064a5-78b1-49a5-a46f-8d155af983ba-host\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz" Apr 22 19:23:07.591073 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysconfig\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.591232 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-kubelet\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.591232 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysctl-d\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.591232 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63bd826c-675d-4901-ac56-91d345994e80-tmp\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.591340 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-k8s-cni-cncf-io\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591340 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-conf-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591340 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-slash\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.591340 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-cni-multus\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-socket-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.591466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-modprobe-d\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.591466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-cni-bin\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-kubelet\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f5fd4978-3887-4451-945f-2523ac01e21d-multus-daemon-config\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5ww\" (UniqueName: \"kubernetes.io/projected/f5fd4978-3887-4451-945f-2523ac01e21d-kube-api-access-sr5ww\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/13bf1528-14c5-43a6-a2a9-60cf081b25b0-dbus\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-systemd\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-slash\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591562 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-k8s-cni-cncf-io\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-conf-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-cni-bin\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-kubelet\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.591682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-kubelet\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-var-lib-cni-multus\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-var-lib-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovn-node-metrics-cert\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysctl-d\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-socket-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-cnibin\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-var-lib-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-modprobe-d\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-socket-dir-parent\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-systemd\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-system-cni-dir\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-cnibin\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591966 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/57a867c3-e773-4882-a1b2-dc753d0d39ef-konnectivity-ca\") pod \"konnectivity-agent-jth69\" (UID: \"57a867c3-e773-4882-a1b2-dc753d0d39ef\") " pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-socket-dir-parent\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cdc22288-9935-403f-8e99-11cb3daf1c99-iptables-alerter-script\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.591984 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-netns\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592095 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-netns\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-run-netns\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-log-socket\") pod 
\"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-run-netns\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-sys-fs\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-log-socket\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6lbx\" (UniqueName: \"kubernetes.io/projected/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-kube-api-access-d6lbx\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-sys-fs\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-kubernetes\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysctl-conf\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592384 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-systemd-units\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592398 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-kubernetes\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-systemd-units\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-var-lib-kubelet\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-cni-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.592791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592477 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-var-lib-kubelet\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.593552 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592495 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cdc22288-9935-403f-8e99-11cb3daf1c99-iptables-alerter-script\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.593552 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-sysctl-conf\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.593552 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-os-release\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.597683 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.592541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-multus-cni-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.598103 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598065 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.598187 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovn-node-metrics-cert\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.598187 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-cni-bin\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.598187 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-cni-bin\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.598326 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63bd826c-675d-4901-ac56-91d345994e80-tmp\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.598326 ip-10-0-141-16 kubenswrapper[2578]: I0422 
19:23:07.598235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-cni-netd\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.598326 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598302 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-cni-netd\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.598449 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-run\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.598449 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-run\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.598449 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63bd826c-675d-4901-ac56-91d345994e80-etc-tuned\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.598580 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598519 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-system-cni-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.598580 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f5fd4978-3887-4451-945f-2523ac01e21d-cni-binary-copy\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.598675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.598675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-system-cni-dir\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.598675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.598812 ip-10-0-141-16 kubenswrapper[2578]: 
I0422 19:23:07.598726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8m9d\" (UniqueName: \"kubernetes.io/projected/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-kube-api-access-m8m9d\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-multus-certs\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ef064a5-78b1-49a5-a46f-8d155af983ba-serviceca\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.598979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvrs8\" (UniqueName: \"kubernetes.io/projected/cdc22288-9935-403f-8e99-11cb3daf1c99-kube-api-access-pvrs8\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 
19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b4dh\" (UniqueName: \"kubernetes.io/projected/6ef064a5-78b1-49a5-a46f-8d155af983ba-kube-api-access-5b4dh\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-registration-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-device-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-etc-kubernetes\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f5fd4978-3887-4451-945f-2523ac01e21d-cni-binary-copy\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " 
pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-registration-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovnkube-config\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.599594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-etc-selinux\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-etc-kubernetes\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599677 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-device-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.599716 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-host-run-multus-certs\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-sys\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-etc-selinux\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-lib-modules\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-sys\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.600308 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f9rc5\" (UniqueName: \"kubernetes.io/projected/63bd826c-675d-4901-ac56-91d345994e80-kube-api-access-f9rc5\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.600721 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovnkube-config\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.600768 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63bd826c-675d-4901-ac56-91d345994e80-etc-tuned\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.600984 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.600947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-lib-modules\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.601196 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/13bf1528-14c5-43a6-a2a9-60cf081b25b0-kubelet-config\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:07.601279 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601207 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:07.601279 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg5w\" (UniqueName: \"kubernetes.io/projected/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-kube-api-access-4vg5w\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:07.601378 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-etc-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.601378 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-env-overrides\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.601378 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-systemd\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" Apr 22 19:23:07.601527 
ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t" Apr 22 19:23:07.601527 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-etc-openvswitch\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:07.601626 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxpb\" (UniqueName: \"kubernetes.io/projected/ddd68968-e706-4765-85c0-cc5f617ffb19-kube-api-access-ksxpb\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln" Apr 22 19:23:07.601626 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/57a867c3-e773-4882-a1b2-dc753d0d39ef-agent-certs\") pod \"konnectivity-agent-jth69\" (UID: \"57a867c3-e773-4882-a1b2-dc753d0d39ef\") " pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:07.601626 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601600 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cdc22288-9935-403f-8e99-11cb3daf1c99-host-slash\") pod \"iptables-alerter-mncb2\" (UID: 
\"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2" Apr 22 19:23:07.601763 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-os-release\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.601763 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601666 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-hostroot\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk" Apr 22 19:23:07.601763 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601700 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:07.601763 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ddd68968-e706-4765-85c0-cc5f617ffb19-hosts-file\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln" Apr 22 19:23:07.601946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddd68968-e706-4765-85c0-cc5f617ffb19-tmp-dir\") pod 
\"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.601946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-ovn\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.601946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovnkube-script-lib\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.601946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-run-ovn\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.601946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-os-release\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.601936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-host\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-etc-systemd\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602124 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63bd826c-675d-4901-ac56-91d345994e80-host\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cdc22288-9935-403f-8e99-11cb3daf1c99-host-slash\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2"
Apr 22 19:23:07.602307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602253 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-env-overrides\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.602609 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.602316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f5fd4978-3887-4451-945f-2523ac01e21d-hostroot\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk"
Apr 22 19:23:07.603110 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.603086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-ovnkube-script-lib\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.604752 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.604693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5ww\" (UniqueName: \"kubernetes.io/projected/f5fd4978-3887-4451-945f-2523ac01e21d-kube-api-access-sr5ww\") pod \"multus-66jvk\" (UID: \"f5fd4978-3887-4451-945f-2523ac01e21d\") " pod="openshift-multus/multus-66jvk"
Apr 22 19:23:07.605824 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.605776 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6lbx\" (UniqueName: \"kubernetes.io/projected/3ffe4dbc-a0ce-4c8c-a232-6c23ed952136-kube-api-access-d6lbx\") pod \"aws-ebs-csi-driver-node-rpr9q\" (UID: \"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q"
Apr 22 19:23:07.608250 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.608227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8m9d\" (UniqueName: \"kubernetes.io/projected/721dc9c4-46d2-43f9-960d-7a7ecd3081a9-kube-api-access-m8m9d\") pod \"ovnkube-node-cz572\" (UID: \"721dc9c4-46d2-43f9-960d-7a7ecd3081a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.608490 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.608456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvrs8\" (UniqueName: \"kubernetes.io/projected/cdc22288-9935-403f-8e99-11cb3daf1c99-kube-api-access-pvrs8\") pod \"iptables-alerter-mncb2\" (UID: \"cdc22288-9935-403f-8e99-11cb3daf1c99\") " pod="openshift-network-operator/iptables-alerter-mncb2"
Apr 22 19:23:07.609199 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.609179 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9rc5\" (UniqueName: \"kubernetes.io/projected/63bd826c-675d-4901-ac56-91d345994e80-kube-api-access-f9rc5\") pod \"tuned-wr9b8\" (UID: \"63bd826c-675d-4901-ac56-91d345994e80\") " pod="openshift-cluster-node-tuning-operator/tuned-wr9b8"
Apr 22 19:23:07.703334 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/13bf1528-14c5-43a6-a2a9-60cf081b25b0-kubelet-config\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:07.703521 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht"
Apr 22 19:23:07.703521 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/13bf1528-14c5-43a6-a2a9-60cf081b25b0-kubelet-config\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:07.703521 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.703452 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:07.703521 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg5w\" (UniqueName: \"kubernetes.io/projected/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-kube-api-access-4vg5w\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht"
Apr 22 19:23:07.703521 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.703546 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:08.203498418 +0000 UTC m=+3.119552677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxpb\" (UniqueName: \"kubernetes.io/projected/ddd68968-e706-4765-85c0-cc5f617ffb19-kube-api-access-ksxpb\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/57a867c3-e773-4882-a1b2-dc753d0d39ef-agent-certs\") pod \"konnectivity-agent-jth69\" (UID: \"57a867c3-e773-4882-a1b2-dc753d0d39ef\") " pod="kube-system/konnectivity-agent-jth69"
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ddd68968-e706-4765-85c0-cc5f617ffb19-hosts-file\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddd68968-e706-4765-85c0-cc5f617ffb19-tmp-dir\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.703774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-cnibin\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blrbd\" (UniqueName: \"kubernetes.io/projected/ef0d702b-9f81-4046-b801-085bdfdf12b5-kube-api-access-blrbd\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ef064a5-78b1-49a5-a46f-8d155af983ba-host\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/13bf1528-14c5-43a6-a2a9-60cf081b25b0-dbus\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-system-cni-dir\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/57a867c3-e773-4882-a1b2-dc753d0d39ef-konnectivity-ca\") pod \"konnectivity-agent-jth69\" (UID: \"57a867c3-e773-4882-a1b2-dc753d0d39ef\") " pod="kube-system/konnectivity-agent-jth69"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-os-release\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.703992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ef064a5-78b1-49a5-a46f-8d155af983ba-serviceca\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5b4dh\" (UniqueName: \"kubernetes.io/projected/6ef064a5-78b1-49a5-a46f-8d155af983ba-kube-api-access-5b4dh\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704106 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-system-cni-dir\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-cnibin\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704217 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddd68968-e706-4765-85c0-cc5f617ffb19-tmp-dir\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.704234 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ef064a5-78b1-49a5-a46f-8d155af983ba-host\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.704279 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret podName:13bf1528-14c5-43a6-a2a9-60cf081b25b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:08.204264687 +0000 UTC m=+3.120318941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret") pod "global-pull-secret-syncer-rgfwb" (UID: "13bf1528-14c5-43a6-a2a9-60cf081b25b0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/13bf1528-14c5-43a6-a2a9-60cf081b25b0-dbus\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ddd68968-e706-4765-85c0-cc5f617ffb19-hosts-file\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-os-release\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6ef064a5-78b1-49a5-a46f-8d155af983ba-serviceca\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.704808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.704717 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef0d702b-9f81-4046-b801-085bdfdf12b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.705347 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.705072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/57a867c3-e773-4882-a1b2-dc753d0d39ef-konnectivity-ca\") pod \"konnectivity-agent-jth69\" (UID: \"57a867c3-e773-4882-a1b2-dc753d0d39ef\") " pod="kube-system/konnectivity-agent-jth69"
Apr 22 19:23:07.705347 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.705112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ef0d702b-9f81-4046-b801-085bdfdf12b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.706979 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.706958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/57a867c3-e773-4882-a1b2-dc753d0d39ef-agent-certs\") pod \"konnectivity-agent-jth69\" (UID: \"57a867c3-e773-4882-a1b2-dc753d0d39ef\") " pod="kube-system/konnectivity-agent-jth69"
Apr 22 19:23:07.714769 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.714702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg5w\" (UniqueName: \"kubernetes.io/projected/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-kube-api-access-4vg5w\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht"
Apr 22 19:23:07.714769 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.714723 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:07.714769 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.714738 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:07.714769 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.714753 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ghbrz for pod openshift-network-diagnostics/network-check-target-spzjt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:07.715045 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:07.714815 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz podName:115a7622-6567-4b7d-83ff-39248615e827 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:08.214798876 +0000 UTC m=+3.130853136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ghbrz" (UniqueName: "kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz") pod "network-check-target-spzjt" (UID: "115a7622-6567-4b7d-83ff-39248615e827") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:07.715576 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.715551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blrbd\" (UniqueName: \"kubernetes.io/projected/ef0d702b-9f81-4046-b801-085bdfdf12b5-kube-api-access-blrbd\") pod \"multus-additional-cni-plugins-grm6t\" (UID: \"ef0d702b-9f81-4046-b801-085bdfdf12b5\") " pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.715777 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.715752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxpb\" (UniqueName: \"kubernetes.io/projected/ddd68968-e706-4765-85c0-cc5f617ffb19-kube-api-access-ksxpb\") pod \"node-resolver-rkjln\" (UID: \"ddd68968-e706-4765-85c0-cc5f617ffb19\") " pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.716815 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.716793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b4dh\" (UniqueName: \"kubernetes.io/projected/6ef064a5-78b1-49a5-a46f-8d155af983ba-kube-api-access-5b4dh\") pod \"node-ca-47vkz\" (UID: \"6ef064a5-78b1-49a5-a46f-8d155af983ba\") " pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.781881 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.781852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-66jvk"
Apr 22 19:23:07.791614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.791591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mncb2"
Apr 22 19:23:07.801294 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.801269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cz572"
Apr 22 19:23:07.805998 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.805972 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q"
Apr 22 19:23:07.812415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.812398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wr9b8"
Apr 22 19:23:07.824910 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.824889 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rkjln"
Apr 22 19:23:07.832417 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.832401 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-47vkz"
Apr 22 19:23:07.839924 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.839904 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jth69"
Apr 22 19:23:07.846455 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.846433 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-grm6t"
Apr 22 19:23:07.967011 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:07.966906 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:08.208157 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.208119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:08.208336 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.208175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht"
Apr 22 19:23:08.208336 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.208308 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:08.208336 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.208320 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:08.208493 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.208383 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret podName:13bf1528-14c5-43a6-a2a9-60cf081b25b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:09.208362544 +0000 UTC m=+4.124416813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret") pod "global-pull-secret-syncer-rgfwb" (UID: "13bf1528-14c5-43a6-a2a9-60cf081b25b0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:08.208493 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.208403 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:09.208393406 +0000 UTC m=+4.124447660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:08.309566 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.309485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:08.309730 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.309607 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:08.309730 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.309633 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:08.309730 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.309649 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ghbrz for pod openshift-network-diagnostics/network-check-target-spzjt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:08.309730 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.309707 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz podName:115a7622-6567-4b7d-83ff-39248615e827 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:09.30968888 +0000 UTC m=+4.225743135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ghbrz" (UniqueName: "kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz") pod "network-check-target-spzjt" (UID: "115a7622-6567-4b7d-83ff-39248615e827") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:08.395355 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.395193 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc22288_9935_403f_8e99_11cb3daf1c99.slice/crio-ab2795e19fc7d52186c8f20b259e44d0a4eb9ac3ede48ae56d2c496bbb2ce657 WatchSource:0}: Error finding container ab2795e19fc7d52186c8f20b259e44d0a4eb9ac3ede48ae56d2c496bbb2ce657: Status 404 returned error can't find the container with id ab2795e19fc7d52186c8f20b259e44d0a4eb9ac3ede48ae56d2c496bbb2ce657
Apr 22 19:23:08.397930 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.397902 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ffe4dbc_a0ce_4c8c_a232_6c23ed952136.slice/crio-c0dd885484086a90df1f50538234447c06293a4cd19055307b73c0499d58993d WatchSource:0}: Error finding container c0dd885484086a90df1f50538234447c06293a4cd19055307b73c0499d58993d: Status 404 returned error can't find the container with id c0dd885484086a90df1f50538234447c06293a4cd19055307b73c0499d58993d
Apr 22 19:23:08.400311 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.400288 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63bd826c_675d_4901_ac56_91d345994e80.slice/crio-e44b9624d29b186c431d35f9fc40fa9d3a63883f671bcd593679afad5e1dac40 WatchSource:0}: Error finding container e44b9624d29b186c431d35f9fc40fa9d3a63883f671bcd593679afad5e1dac40: Status 404 returned error can't find the container with id e44b9624d29b186c431d35f9fc40fa9d3a63883f671bcd593679afad5e1dac40
Apr 22 19:23:08.400903 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.400880 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef0d702b_9f81_4046_b801_085bdfdf12b5.slice/crio-ce952262583b50eca76a860d7de90e0d0347d5d015451d21260cbea3a04e2e97 WatchSource:0}: Error finding container ce952262583b50eca76a860d7de90e0d0347d5d015451d21260cbea3a04e2e97: Status 404 returned error can't find the container with id ce952262583b50eca76a860d7de90e0d0347d5d015451d21260cbea3a04e2e97
Apr 22 19:23:08.402358 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.402334 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd68968_e706_4765_85c0_cc5f617ffb19.slice/crio-50d1d82b3344b863000c2f3cf8e56db942d9843db01c70802c9f2154dc3817f1 WatchSource:0}: Error finding container 50d1d82b3344b863000c2f3cf8e56db942d9843db01c70802c9f2154dc3817f1: Status 404 returned error can't find the container with id 50d1d82b3344b863000c2f3cf8e56db942d9843db01c70802c9f2154dc3817f1
Apr 22 19:23:08.403247 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.403199 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5fd4978_3887_4451_945f_2523ac01e21d.slice/crio-3321f61632864064a92cf44f3e433d8bbc64aa0a48f2ffdfa2525d07b0c42fa4 WatchSource:0}: Error finding container 3321f61632864064a92cf44f3e433d8bbc64aa0a48f2ffdfa2525d07b0c42fa4: Status 404 returned error can't find the container with id 3321f61632864064a92cf44f3e433d8bbc64aa0a48f2ffdfa2525d07b0c42fa4
Apr 22 19:23:08.404227 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.404180 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef064a5_78b1_49a5_a46f_8d155af983ba.slice/crio-590c2fa9632181fd6a6386aa8716e192d28a28ffe26efd46649a10d7ec49d2f7 WatchSource:0}: Error finding container 590c2fa9632181fd6a6386aa8716e192d28a28ffe26efd46649a10d7ec49d2f7: Status 404 returned error can't find the container with id 590c2fa9632181fd6a6386aa8716e192d28a28ffe26efd46649a10d7ec49d2f7
Apr 22 19:23:08.405333 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.405299 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a867c3_e773_4882_a1b2_dc753d0d39ef.slice/crio-2df0460b534822d1eb42af682f497a14a88095e550e4bf36bc4c47318e99b2cd WatchSource:0}: Error finding container 2df0460b534822d1eb42af682f497a14a88095e550e4bf36bc4c47318e99b2cd: Status 404 returned error can't find the container with id 2df0460b534822d1eb42af682f497a14a88095e550e4bf36bc4c47318e99b2cd
Apr 22 19:23:08.405694 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:08.405676 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod721dc9c4_46d2_43f9_960d_7a7ecd3081a9.slice/crio-61a55104e2bd821facae0a6cdc1e7c8d8a0b672f54ec710f7ef9e2e793a1dca5 WatchSource:0}: Error finding container 61a55104e2bd821facae0a6cdc1e7c8d8a0b672f54ec710f7ef9e2e793a1dca5: Status 404 returned error can't find the container with id 61a55104e2bd821facae0a6cdc1e7c8d8a0b672f54ec710f7ef9e2e793a1dca5
Apr 22 19:23:08.520625 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.520589 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:06 +0000 UTC" deadline="2028-01-10 05:27:21.802948584 +0000 UTC"
Apr 22 19:23:08.520625 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.520620 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15058h4m13.282330508s"
Apr 22 19:23:08.575116 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.575015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:08.575116 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.575014 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:08.575283 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.575144 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:08.575283 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:08.575225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:08.585587 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.585560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" event={"ID":"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136","Type":"ContainerStarted","Data":"c0dd885484086a90df1f50538234447c06293a4cd19055307b73c0499d58993d"} Apr 22 19:23:08.586689 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.586668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"61a55104e2bd821facae0a6cdc1e7c8d8a0b672f54ec710f7ef9e2e793a1dca5"} Apr 22 19:23:08.587717 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.587693 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-66jvk" event={"ID":"f5fd4978-3887-4451-945f-2523ac01e21d","Type":"ContainerStarted","Data":"3321f61632864064a92cf44f3e433d8bbc64aa0a48f2ffdfa2525d07b0c42fa4"} Apr 22 19:23:08.588654 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.588629 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" event={"ID":"63bd826c-675d-4901-ac56-91d345994e80","Type":"ContainerStarted","Data":"e44b9624d29b186c431d35f9fc40fa9d3a63883f671bcd593679afad5e1dac40"} Apr 22 19:23:08.589474 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:23:08.589440 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mncb2" event={"ID":"cdc22288-9935-403f-8e99-11cb3daf1c99","Type":"ContainerStarted","Data":"ab2795e19fc7d52186c8f20b259e44d0a4eb9ac3ede48ae56d2c496bbb2ce657"} Apr 22 19:23:08.590885 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.590863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" event={"ID":"237efac7542ae805317afa8331e5e27b","Type":"ContainerStarted","Data":"6d2285c2e31c1b11f2dd86c4b5e1fd481ef6757ec9e9fbb9cf1bdce35169ad73"} Apr 22 19:23:08.592047 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.592027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jth69" event={"ID":"57a867c3-e773-4882-a1b2-dc753d0d39ef","Type":"ContainerStarted","Data":"2df0460b534822d1eb42af682f497a14a88095e550e4bf36bc4c47318e99b2cd"} Apr 22 19:23:08.592804 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.592784 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-47vkz" event={"ID":"6ef064a5-78b1-49a5-a46f-8d155af983ba","Type":"ContainerStarted","Data":"590c2fa9632181fd6a6386aa8716e192d28a28ffe26efd46649a10d7ec49d2f7"} Apr 22 19:23:08.593779 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.593759 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rkjln" event={"ID":"ddd68968-e706-4765-85c0-cc5f617ffb19","Type":"ContainerStarted","Data":"50d1d82b3344b863000c2f3cf8e56db942d9843db01c70802c9f2154dc3817f1"} Apr 22 19:23:08.597210 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.597188 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerStarted","Data":"ce952262583b50eca76a860d7de90e0d0347d5d015451d21260cbea3a04e2e97"} Apr 
22 19:23:08.603670 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:08.603622 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" podStartSLOduration=2.6036089970000003 podStartE2EDuration="2.603608997s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:08.603135747 +0000 UTC m=+3.519190024" watchObservedRunningTime="2026-04-22 19:23:08.603608997 +0000 UTC m=+3.519663273" Apr 22 19:23:09.218809 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:09.217948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:09.218809 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:09.218017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:09.218809 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.218200 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:09.218809 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.218263 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:11.218244062 +0000 UTC m=+6.134298321 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:09.218809 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.218669 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:09.218809 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.218723 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret podName:13bf1528-14c5-43a6-a2a9-60cf081b25b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:11.218707975 +0000 UTC m=+6.134762229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret") pod "global-pull-secret-syncer-rgfwb" (UID: "13bf1528-14c5-43a6-a2a9-60cf081b25b0") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:09.319382 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:09.319345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:09.319576 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.319557 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:09.319642 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.319588 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:09.319642 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.319600 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ghbrz for pod openshift-network-diagnostics/network-check-target-spzjt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:09.319734 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.319662 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz podName:115a7622-6567-4b7d-83ff-39248615e827 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:11.319643433 +0000 UTC m=+6.235697690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ghbrz" (UniqueName: "kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz") pod "network-check-target-spzjt" (UID: "115a7622-6567-4b7d-83ff-39248615e827") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:09.577772 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:09.577692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:09.578222 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:09.577838 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:09.617901 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:09.614091 2578 generic.go:358] "Generic (PLEG): container finished" podID="27d97c8c240a436d06b1c4f45cd224be" containerID="21f64a3c6f453f6c050ec1bf0aa2ef72b2d0378a620b548d3c48f8ac7c78b0cf" exitCode=0 Apr 22 19:23:09.617901 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:09.614439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerDied","Data":"21f64a3c6f453f6c050ec1bf0aa2ef72b2d0378a620b548d3c48f8ac7c78b0cf"} Apr 22 19:23:10.575443 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:10.575407 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:10.575618 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:10.575542 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:10.575677 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:10.575646 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:10.575761 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:10.575720 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:10.628694 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:10.628656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerStarted","Data":"4cefe5208ed62b1863ce147b2ac6b27e78995b3c726da87597b63dbe979ded76"} Apr 22 19:23:11.236466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:11.236404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:11.236466 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:11.236459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:11.236854 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.236577 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:11.236854 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.236638 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:15.236621118 +0000 UTC m=+10.152675380 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:11.237190 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.237092 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:11.237190 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.237147 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret podName:13bf1528-14c5-43a6-a2a9-60cf081b25b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:15.237131188 +0000 UTC m=+10.153185449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret") pod "global-pull-secret-syncer-rgfwb" (UID: "13bf1528-14c5-43a6-a2a9-60cf081b25b0") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:11.338502 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:11.337895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:11.338502 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.338068 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:11.338502 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.338087 2578 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:11.338502 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.338100 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ghbrz for pod openshift-network-diagnostics/network-check-target-spzjt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:11.338502 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.338160 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz podName:115a7622-6567-4b7d-83ff-39248615e827 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:15.338140863 +0000 UTC m=+10.254195139 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ghbrz" (UniqueName: "kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz") pod "network-check-target-spzjt" (UID: "115a7622-6567-4b7d-83ff-39248615e827") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:11.576174 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:11.576090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:11.576342 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:11.576237 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:12.575783 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:12.575700 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:12.575783 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:12.575741 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:12.576339 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:12.575822 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:12.576339 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:12.576214 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:13.575477 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:13.575373 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:13.575671 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:13.575519 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:14.575598 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:14.575548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:14.576127 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:14.575565 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:14.576127 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:14.575692 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:14.576127 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:14.575765 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:15.273610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:15.273298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:15.273610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:15.273355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:15.273610 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.273482 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:15.273610 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.273536 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:15.273610 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.273561 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret podName:13bf1528-14c5-43a6-a2a9-60cf081b25b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:23.273541068 +0000 UTC m=+18.189595349 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret") pod "global-pull-secret-syncer-rgfwb" (UID: "13bf1528-14c5-43a6-a2a9-60cf081b25b0") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:15.273610 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.273589 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:23.273570915 +0000 UTC m=+18.189625174 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:15.374280 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:15.374232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:15.374458 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.374409 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:15.374458 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.374434 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:15.374458 ip-10-0-141-16 kubenswrapper[2578]: 
E0422 19:23:15.374448 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ghbrz for pod openshift-network-diagnostics/network-check-target-spzjt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:15.374609 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.374512 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz podName:115a7622-6567-4b7d-83ff-39248615e827 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:23.374494909 +0000 UTC m=+18.290549170 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ghbrz" (UniqueName: "kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz") pod "network-check-target-spzjt" (UID: "115a7622-6567-4b7d-83ff-39248615e827") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:15.577396 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:15.577297 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:15.577835 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:15.577421 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:16.575082 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:16.574984 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:16.575082 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:16.575034 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:16.575337 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:16.575144 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:16.575337 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:16.575274 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:17.575127 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:17.575050 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:17.575520 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:17.575184 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:18.574766 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:18.574735 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:18.574766 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:18.574761 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:18.574995 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:18.574853 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:18.574995 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:18.574924 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:19.575156 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:19.575116 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:19.575688 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:19.575281 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:20.574805 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:20.574766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:20.575022 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:20.574766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:20.575022 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:20.574900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:20.575022 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:20.574983 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:21.575624 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:21.575591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:21.576050 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:21.575713 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:22.575415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:22.575368 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:22.575415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:22.575413 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:22.575667 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:22.575500 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:22.575941 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:22.575653 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:23.335968 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:23.335925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:23.335968 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:23.335976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:23.336268 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.336063 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:23.336268 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.336094 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:23.336268 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.336140 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret podName:13bf1528-14c5-43a6-a2a9-60cf081b25b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.3361228 +0000 UTC m=+34.252177059 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret") pod "global-pull-secret-syncer-rgfwb" (UID: "13bf1528-14c5-43a6-a2a9-60cf081b25b0") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:23.336268 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.336157 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.33614887 +0000 UTC m=+34.252203125 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:23.437352 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:23.437312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:23.437545 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.437512 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:23.437614 ip-10-0-141-16 
kubenswrapper[2578]: E0422 19:23:23.437543 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:23.437614 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.437560 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ghbrz for pod openshift-network-diagnostics/network-check-target-spzjt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:23.437720 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.437630 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz podName:115a7622-6567-4b7d-83ff-39248615e827 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.437610729 +0000 UTC m=+34.353665004 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ghbrz" (UniqueName: "kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz") pod "network-check-target-spzjt" (UID: "115a7622-6567-4b7d-83ff-39248615e827") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:23.574865 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:23.574831 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:23.575072 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:23.574963 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:24.575766 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:24.575729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:24.576264 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:24.575729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:24.576264 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:24.575850 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:24.576264 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:24.575916 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:25.576106 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:25.576082 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:25.576505 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:25.576167 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:26.575309 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.574977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:26.575471 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.574977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:26.575471 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:26.575391 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:26.575471 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:26.575450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:26.657662 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.657621 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jth69" event={"ID":"57a867c3-e773-4882-a1b2-dc753d0d39ef","Type":"ContainerStarted","Data":"c3ca50a6e8733c83c4bc8cb4e3e3e5daa36031af7b1c82f3d2cf098a8bed0508"} Apr 22 19:23:26.659183 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.659157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-47vkz" event={"ID":"6ef064a5-78b1-49a5-a46f-8d155af983ba","Type":"ContainerStarted","Data":"ed6e1774481d1a19fd330952874c2cf80624e57aad9d65dd75f50ba01fcb762d"} Apr 22 19:23:26.660371 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.660345 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rkjln" event={"ID":"ddd68968-e706-4765-85c0-cc5f617ffb19","Type":"ContainerStarted","Data":"18b3e1aca8114313146975c798953f7dd8125131bae422290db923b28dbdfc13"} Apr 22 19:23:26.661944 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.661918 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef0d702b-9f81-4046-b801-085bdfdf12b5" containerID="6e5f1590a15ce3863f4e0e84956874d8046325aec29debd3a4123887eb602bb8" exitCode=0 Apr 22 19:23:26.662051 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.661990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerDied","Data":"6e5f1590a15ce3863f4e0e84956874d8046325aec29debd3a4123887eb602bb8"} Apr 22 19:23:26.663581 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.663557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" 
event={"ID":"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136","Type":"ContainerStarted","Data":"fc607d268fc0af6f822bc41a992a59c0956b2f4d3a57b659c1b5deca572013e3"} Apr 22 19:23:26.666310 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.666288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"1125abbbc776b014c1eb43e198cefa892eb0cbfd6b9e31e2518187f1f53bb16b"} Apr 22 19:23:26.666410 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.666318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"371e22c54346bea89f0d4e0a968c152874213f43e3b87dfa8b095aa4194cacd6"} Apr 22 19:23:26.666410 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.666348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"ae02abd034dfe04468e13041b996082fff84629ce412535aab9a9387989d1329"} Apr 22 19:23:26.666410 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.666360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"328c6cfc1efef4146e6128a93c4fbd3df6740f052667a96a1b48c030816c14b4"} Apr 22 19:23:26.666410 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.666375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"4a113ff342af12f0216448cf7a0441665ae54ae9014b1a06550ba682323fa93d"} Apr 22 19:23:26.666410 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.666389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"8d0abaafdd20a7138b4af9d733174cf35fec01222733336ca675dab8d2c03b75"} Apr 22 19:23:26.667556 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.667537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-66jvk" event={"ID":"f5fd4978-3887-4451-945f-2523ac01e21d","Type":"ContainerStarted","Data":"3602d69d637b48eb804cedca10724626eb594e26731720f5e5ec577f1ec6a8d5"} Apr 22 19:23:26.668727 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.668708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" event={"ID":"63bd826c-675d-4901-ac56-91d345994e80","Type":"ContainerStarted","Data":"ea71e06e0e463862877270ef81944e8a265c1b8ab487ec149f5bda78d80a5ea5"} Apr 22 19:23:26.670995 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.670964 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jth69" podStartSLOduration=4.509044665 podStartE2EDuration="21.670955044s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.407747893 +0000 UTC m=+3.323802156" lastFinishedPulling="2026-04-22 19:23:25.569658267 +0000 UTC m=+20.485712535" observedRunningTime="2026-04-22 19:23:26.670877382 +0000 UTC m=+21.586931660" watchObservedRunningTime="2026-04-22 19:23:26.670955044 +0000 UTC m=+21.587009321" Apr 22 19:23:26.671331 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.671309 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" podStartSLOduration=20.671304107 podStartE2EDuration="20.671304107s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 19:23:10.643022844 +0000 UTC m=+5.559077119" watchObservedRunningTime="2026-04-22 19:23:26.671304107 +0000 UTC m=+21.587358383" Apr 22 19:23:26.684674 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.684643 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wr9b8" podStartSLOduration=4.51655923 podStartE2EDuration="21.684631619s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.401851119 +0000 UTC m=+3.317905378" lastFinishedPulling="2026-04-22 19:23:25.569923512 +0000 UTC m=+20.485977767" observedRunningTime="2026-04-22 19:23:26.683927449 +0000 UTC m=+21.599981737" watchObservedRunningTime="2026-04-22 19:23:26.684631619 +0000 UTC m=+21.600685896" Apr 22 19:23:26.716807 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.716761 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-66jvk" podStartSLOduration=4.521462885 podStartE2EDuration="21.71674476s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.405116636 +0000 UTC m=+3.321170890" lastFinishedPulling="2026-04-22 19:23:25.600398511 +0000 UTC m=+20.516452765" observedRunningTime="2026-04-22 19:23:26.716652885 +0000 UTC m=+21.632707164" watchObservedRunningTime="2026-04-22 19:23:26.71674476 +0000 UTC m=+21.632799038" Apr 22 19:23:26.732998 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.732959 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rkjln" podStartSLOduration=4.891053459 podStartE2EDuration="21.732949034s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.404228894 +0000 UTC m=+3.320283153" lastFinishedPulling="2026-04-22 19:23:25.246124456 +0000 UTC m=+20.162178728" observedRunningTime="2026-04-22 19:23:26.732843908 +0000 UTC m=+21.648898184" 
watchObservedRunningTime="2026-04-22 19:23:26.732949034 +0000 UTC m=+21.649003311" Apr 22 19:23:26.747489 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.747453 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-47vkz" podStartSLOduration=4.585561932 podStartE2EDuration="21.747442023s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.40787129 +0000 UTC m=+3.323925547" lastFinishedPulling="2026-04-22 19:23:25.569751382 +0000 UTC m=+20.485805638" observedRunningTime="2026-04-22 19:23:26.746899273 +0000 UTC m=+21.662953564" watchObservedRunningTime="2026-04-22 19:23:26.747442023 +0000 UTC m=+21.663496299" Apr 22 19:23:26.780687 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:26.780665 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:27.544460 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.544243 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:26.780683082Z","UUID":"3503114c-7aea-427e-84a3-317df594794e","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:27.547145 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.547014 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:23:27.547145 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.547044 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:23:27.575334 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.575308 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:27.575509 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:27.575418 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:27.673392 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.673133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" event={"ID":"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136","Type":"ContainerStarted","Data":"ced443d7df5c20ba51e9af865c4a410a956646e1e52e9c3db39266f5710e103c"} Apr 22 19:23:27.674565 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.674494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mncb2" event={"ID":"cdc22288-9935-403f-8e99-11cb3daf1c99","Type":"ContainerStarted","Data":"5f7d64b91e2556631d1405276c0e2f1744a887cf8410e31661d4873f0968fb52"} Apr 22 19:23:27.688593 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:27.688549 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mncb2" podStartSLOduration=5.515743217 podStartE2EDuration="22.68853482s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.396745588 +0000 UTC m=+3.312799844" lastFinishedPulling="2026-04-22 19:23:25.569537188 +0000 UTC m=+20.485591447" observedRunningTime="2026-04-22 19:23:27.687819412 +0000 UTC m=+22.603873686" watchObservedRunningTime="2026-04-22 19:23:27.68853482 +0000 UTC m=+22.604589097" Apr 22 19:23:28.575794 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:28.575710 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:28.576021 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:28.575710 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:28.576021 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:28.575842 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:28.576021 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:28.575957 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:28.678378 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:28.678334 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" event={"ID":"3ffe4dbc-a0ce-4c8c-a232-6c23ed952136","Type":"ContainerStarted","Data":"8ef9fc528446e254c1eaa180d98d3b502e374d4f1064e6a9f5d65d1b86cbe590"} Apr 22 19:23:28.681718 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:28.681684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"f80ddf22609e503c0a4303fefb2bc5f5a5e9f20a23767a5a396007125d4ca43e"} Apr 22 19:23:28.720610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:28.720567 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rpr9q" podStartSLOduration=4.490240643 podStartE2EDuration="23.720554369s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.399718259 +0000 UTC m=+3.315772521" lastFinishedPulling="2026-04-22 19:23:27.630031976 +0000 UTC m=+22.546086247" observedRunningTime="2026-04-22 19:23:28.719894972 +0000 UTC m=+23.635949282" watchObservedRunningTime="2026-04-22 19:23:28.720554369 +0000 UTC m=+23.636608646" Apr 22 19:23:29.575199 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:29.575162 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:29.575387 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:29.575308 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:29.721886 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:29.721841 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:29.722657 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:29.722638 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:30.575104 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:30.575019 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:30.575104 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:30.575046 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:30.575296 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:30.575137 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:30.575296 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:30.575259 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:30.691900 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:30.691877 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:30.692498 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:30.692478 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jth69" Apr 22 19:23:31.575572 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.575391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:31.576138 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:31.575647 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:31.688596 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.688566 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef0d702b-9f81-4046-b801-085bdfdf12b5" containerID="da52dd8e8abd4aecfe522dacb0eb239d4f5a57b70b8c438757745fe29f3450a0" exitCode=0 Apr 22 19:23:31.688793 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.688658 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerDied","Data":"da52dd8e8abd4aecfe522dacb0eb239d4f5a57b70b8c438757745fe29f3450a0"} Apr 22 19:23:31.691866 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.691844 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" event={"ID":"721dc9c4-46d2-43f9-960d-7a7ecd3081a9","Type":"ContainerStarted","Data":"35abe317c47c34cb230618e070cf003d9942a3431d572df36853c036b4608ed3"} Apr 22 19:23:31.692740 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.692722 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:31.692835 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.692749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:31.692835 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.692770 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:31.708124 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.708099 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:31.708371 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.708359 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:23:31.746675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:31.746635 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" podStartSLOduration=9.383111211 podStartE2EDuration="26.746623073s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.407726358 +0000 UTC m=+3.323780628" lastFinishedPulling="2026-04-22 19:23:25.77123822 +0000 UTC m=+20.687292490" observedRunningTime="2026-04-22 19:23:31.744964181 +0000 UTC m=+26.661018458" watchObservedRunningTime="2026-04-22 19:23:31.746623073 +0000 UTC m=+26.662677350" Apr 22 19:23:32.575538 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.575503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:32.575680 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.575514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:32.575680 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:32.575649 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:32.576074 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:32.575681 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:32.933734 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.933704 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rgfwb"] Apr 22 19:23:32.933891 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.933800 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:32.933958 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:32.933889 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:32.936859 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.936835 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-spzjt"] Apr 22 19:23:32.936968 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.936909 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:32.937035 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:32.936974 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:32.939676 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.939654 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-czpht"] Apr 22 19:23:32.939773 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:32.939747 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:32.939843 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:32.939822 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:33.699874 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:33.699687 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef0d702b-9f81-4046-b801-085bdfdf12b5" containerID="9bf5dd6ecd1e1d8cd914f4e9587c8f2f48196f0961478c9bf4194bf6f522a00a" exitCode=0 Apr 22 19:23:33.700266 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:33.699762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerDied","Data":"9bf5dd6ecd1e1d8cd914f4e9587c8f2f48196f0961478c9bf4194bf6f522a00a"} Apr 22 19:23:34.575067 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:34.575022 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:34.575250 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:34.575028 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:34.575250 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:34.575149 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:34.575250 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:34.575047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:34.575406 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:34.575195 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:34.575406 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:34.575277 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:34.704441 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:34.704369 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef0d702b-9f81-4046-b801-085bdfdf12b5" containerID="fd85a0375239dcca44e210275e283204f5c3c00a26e750f21ef84bf65e7817c6" exitCode=0 Apr 22 19:23:34.704441 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:34.704428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerDied","Data":"fd85a0375239dcca44e210275e283204f5c3c00a26e750f21ef84bf65e7817c6"} Apr 22 19:23:36.575738 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:36.575706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:23:36.576333 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:36.575706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb" Apr 22 19:23:36.576333 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:36.575824 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spzjt" podUID="115a7622-6567-4b7d-83ff-39248615e827" Apr 22 19:23:36.576333 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:36.575894 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rgfwb" podUID="13bf1528-14c5-43a6-a2a9-60cf081b25b0" Apr 22 19:23:36.576333 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:36.575706 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:23:36.576333 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:36.575994 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:23:37.456600 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.456571 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeReady" Apr 22 19:23:37.456789 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.456730 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:23:37.520833 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.520802 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"] Apr 22 19:23:37.539841 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.538851 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"] Apr 22 19:23:37.539841 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.539053 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.539841 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.539548 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-56v9g"] Apr 22 19:23:37.541789 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.541766 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:23:37.541789 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.541791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:23:37.541967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.541832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:23:37.545093 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.543028 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8fmt6\"" Apr 22 19:23:37.550179 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.550158 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:23:37.553675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.553656 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zx6g9"] Apr 22 19:23:37.553817 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.553801 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.556150 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.556119 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:23:37.556262 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.556240 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6f5sr\"" Apr 22 19:23:37.556375 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.556336 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:23:37.568456 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.568434 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-56v9g"] Apr 22 19:23:37.568581 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.568466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zx6g9"] Apr 22 19:23:37.568581 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.568574 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:23:37.571166 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.571147 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:23:37.571441 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.571424 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:23:37.571526 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.571454 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:23:37.571526 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.571466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrkhq\"" Apr 22 19:23:37.642979 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.642947 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-installation-pull-secrets\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.642979 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.642981 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b284839-b3dc-4bf0-b479-744c1da18b4b-config-volume\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c23b359a-a238-4cae-b9b5-646b1b984bcf-ca-trust-extracted\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfqw5\" (UniqueName: \"kubernetes.io/projected/0b284839-b3dc-4bf0-b479-744c1da18b4b-kube-api-access-kfqw5\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-image-registry-private-configuration\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-bound-sa-token\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-certificates\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: 
\"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643298 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx89j\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-kube-api-access-sx89j\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b284839-b3dc-4bf0-b479-744c1da18b4b-tmp-dir\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.643579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.643438 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-trusted-ca\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.744681 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b284839-b3dc-4bf0-b479-744c1da18b4b-tmp-dir\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.744943 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gmz\" (UniqueName: \"kubernetes.io/projected/75f18f37-b8b6-4514-90d1-259b37372b4b-kube-api-access-k4gmz\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:23:37.744943 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.744943 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-trusted-ca\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.744943 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744891 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-installation-pull-secrets\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.744943 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b284839-b3dc-4bf0-b479-744c1da18b4b-config-volume\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.744943 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c23b359a-a238-4cae-b9b5-646b1b984bcf-ca-trust-extracted\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.744954 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.744980 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b284839-b3dc-4bf0-b479-744c1da18b4b-tmp-dir\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " 
pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.744958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfqw5\" (UniqueName: \"kubernetes.io/projected/0b284839-b3dc-4bf0-b479-744c1da18b4b-kube-api-access-kfqw5\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.745097 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.24507258 +0000 UTC m=+33.161126840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-image-registry-private-configuration\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.745271 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-bound-sa-token\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " 
pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.745614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c23b359a-a238-4cae-b9b5-646b1b984bcf-ca-trust-extracted\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.745614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:23:37.745614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-certificates\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:23:37.745614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:23:37.745614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.745523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sx89j\" (UniqueName: 
\"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-kube-api-access-sx89j\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.745614 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.745604 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:37.745823 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.745664 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.245647953 +0000 UTC m=+33.161702225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:37.746053 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.746033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-certificates\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.746138 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.746054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-trusted-ca\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.749737 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.749679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-installation-pull-secrets\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.749737 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.749693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-image-registry-private-configuration\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.751523 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.751374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b284839-b3dc-4bf0-b479-744c1da18b4b-config-volume\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:37.754032 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.753990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-bound-sa-token\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.754118 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.754026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx89j\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-kube-api-access-sx89j\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:37.754118 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.754033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfqw5\" (UniqueName: \"kubernetes.io/projected/0b284839-b3dc-4bf0-b479-744c1da18b4b-kube-api-access-kfqw5\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:37.846305 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.846266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gmz\" (UniqueName: \"kubernetes.io/projected/75f18f37-b8b6-4514-90d1-259b37372b4b-kube-api-access-k4gmz\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:37.846509 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.846407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:37.846574 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.846505 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:37.846628 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:37.846579 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:38.34656083 +0000 UTC m=+33.262615101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found
Apr 22 19:23:37.857461 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:37.857422 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gmz\" (UniqueName: \"kubernetes.io/projected/75f18f37-b8b6-4514-90d1-259b37372b4b-kube-api-access-k4gmz\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:38.249896 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.249812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:38.249896 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.249862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:38.250223 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.249976 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:38.250223 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.249995 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found
Apr 22 19:23:38.250223 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.249975 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:38.250223 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.250076 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.250055515 +0000 UTC m=+34.166109771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found
Apr 22 19:23:38.250223 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.250132 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.250106975 +0000 UTC m=+34.166161234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:38.350488 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.350448 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:38.350674 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.350617 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:38.350738 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:38.350700 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.35067924 +0000 UTC m=+34.266733514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found
Apr 22 19:23:38.575554 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.575516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht"
Apr 22 19:23:38.575732 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.575515 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:38.575799 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.575516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:38.579818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.579601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:23:38.579818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.579613 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:23:38.579818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.579601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:23:38.579818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.579752 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ff584\""
Apr 22 19:23:38.580127 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.579832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:23:38.580127 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:38.579875 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8qpcz\""
Apr 22 19:23:39.259590 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.259549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:39.260030 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.259612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:39.260030 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.259731 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:39.260030 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.259807 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:41.259784795 +0000 UTC m=+36.175839064 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:39.260030 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.259737 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:39.260030 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.259840 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found
Apr 22 19:23:39.260030 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.259893 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:23:41.25987713 +0000 UTC m=+36.175931390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found
Apr 22 19:23:39.360964 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.360919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:39.360964 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.360965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht"
Apr 22 19:23:39.361199 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.361094 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:23:39.361199 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.361157 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:11.361135588 +0000 UTC m=+66.277189847 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : secret "metrics-daemon-secret" not found
Apr 22 19:23:39.361199 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.361181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:39.361349 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.361323 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:39.361388 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:39.361362 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:41.361351777 +0000 UTC m=+36.277406036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found
Apr 22 19:23:39.363464 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.363437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/13bf1528-14c5-43a6-a2a9-60cf081b25b0-original-pull-secret\") pod \"global-pull-secret-syncer-rgfwb\" (UID: \"13bf1528-14c5-43a6-a2a9-60cf081b25b0\") " pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:39.462122 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.462084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:39.465050 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.465024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghbrz\" (UniqueName: \"kubernetes.io/projected/115a7622-6567-4b7d-83ff-39248615e827-kube-api-access-ghbrz\") pod \"network-check-target-spzjt\" (UID: \"115a7622-6567-4b7d-83ff-39248615e827\") " pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:39.496304 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.496268 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:39.502102 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:39.502079 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rgfwb"
Apr 22 19:23:40.433423 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:40.433167 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rgfwb"]
Apr 22 19:23:40.433944 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:40.433855 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-spzjt"]
Apr 22 19:23:40.483218 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:40.483184 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115a7622_6567_4b7d_83ff_39248615e827.slice/crio-e0958f4499d53fda2c6bb5665d8c7ad60915278d8c12ac73a34b40d0d6241346 WatchSource:0}: Error finding container e0958f4499d53fda2c6bb5665d8c7ad60915278d8c12ac73a34b40d0d6241346: Status 404 returned error can't find the container with id e0958f4499d53fda2c6bb5665d8c7ad60915278d8c12ac73a34b40d0d6241346
Apr 22 19:23:40.484859 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:23:40.484817 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13bf1528_14c5_43a6_a2a9_60cf081b25b0.slice/crio-892a5a10c85a8e43b69afc02c40b463686f979288fdb4ab11e84de08a7b92808 WatchSource:0}: Error finding container 892a5a10c85a8e43b69afc02c40b463686f979288fdb4ab11e84de08a7b92808: Status 404 returned error can't find the container with id 892a5a10c85a8e43b69afc02c40b463686f979288fdb4ab11e84de08a7b92808
Apr 22 19:23:40.719155 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:40.719115 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerStarted","Data":"d3055413b4fa21f9a1cd44b836fac3c0c7116b008ebd00c34520c5466b039e0c"}
Apr 22 19:23:40.720287 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:40.720260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rgfwb" event={"ID":"13bf1528-14c5-43a6-a2a9-60cf081b25b0","Type":"ContainerStarted","Data":"892a5a10c85a8e43b69afc02c40b463686f979288fdb4ab11e84de08a7b92808"}
Apr 22 19:23:40.721300 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:40.721272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-spzjt" event={"ID":"115a7622-6567-4b7d-83ff-39248615e827","Type":"ContainerStarted","Data":"e0958f4499d53fda2c6bb5665d8c7ad60915278d8c12ac73a34b40d0d6241346"}
Apr 22 19:23:41.275093 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:41.275052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:41.275284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:41.275115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:41.275284 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.275205 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:41.275284 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.275253 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:41.275284 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.275267 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found
Apr 22 19:23:41.275443 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.275289 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:45.275267169 +0000 UTC m=+40.191321431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:41.275443 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.275306 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:23:45.275297694 +0000 UTC m=+40.191351950 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found
Apr 22 19:23:41.375722 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:41.375684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:41.375906 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.375831 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:41.375963 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:41.375943 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:45.375921177 +0000 UTC m=+40.291975449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found
Apr 22 19:23:41.726827 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:41.726765 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef0d702b-9f81-4046-b801-085bdfdf12b5" containerID="d3055413b4fa21f9a1cd44b836fac3c0c7116b008ebd00c34520c5466b039e0c" exitCode=0
Apr 22 19:23:41.726827 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:41.726821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerDied","Data":"d3055413b4fa21f9a1cd44b836fac3c0c7116b008ebd00c34520c5466b039e0c"}
Apr 22 19:23:42.731105 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:42.730866 2578 generic.go:358] "Generic (PLEG): container finished" podID="ef0d702b-9f81-4046-b801-085bdfdf12b5" containerID="b76673650416845981fef2ac05764e2a284b5054a91bb5e211d6b9b376089e6e" exitCode=0
Apr 22 19:23:42.731105 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:42.730947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerDied","Data":"b76673650416845981fef2ac05764e2a284b5054a91bb5e211d6b9b376089e6e"}
Apr 22 19:23:43.737018 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:43.736969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grm6t" event={"ID":"ef0d702b-9f81-4046-b801-085bdfdf12b5","Type":"ContainerStarted","Data":"d53d77a5a2fc5b27bdb1f35640a26c024103f2e3d3c8f299b58b3385374684a6"}
Apr 22 19:23:43.765275 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:43.764761 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-grm6t" podStartSLOduration=6.655380086 podStartE2EDuration="38.764739409s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:08.402832454 +0000 UTC m=+3.318886722" lastFinishedPulling="2026-04-22 19:23:40.512191789 +0000 UTC m=+35.428246045" observedRunningTime="2026-04-22 19:23:43.762673918 +0000 UTC m=+38.678728200" watchObservedRunningTime="2026-04-22 19:23:43.764739409 +0000 UTC m=+38.680793688"
Apr 22 19:23:45.306884 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:45.306842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:45.307500 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:45.306962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:45.307500 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.306992 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:45.307500 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.307036 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found
Apr 22 19:23:45.307500 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.307088 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:45.307500 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.307103 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.307082338 +0000 UTC m=+48.223136597 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found
Apr 22 19:23:45.307500 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.307139 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.307122166 +0000 UTC m=+48.223176422 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:45.407647 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:45.407613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:45.407802 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.407787 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:45.407867 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:45.407859 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.40784083 +0000 UTC m=+48.323895084 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found
Apr 22 19:23:46.743883 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:46.743851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rgfwb" event={"ID":"13bf1528-14c5-43a6-a2a9-60cf081b25b0","Type":"ContainerStarted","Data":"c9829519329536e36e949c10bce5c2517fcf5ec2562001b495be70a25e1de514"}
Apr 22 19:23:46.745146 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:46.745123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-spzjt" event={"ID":"115a7622-6567-4b7d-83ff-39248615e827","Type":"ContainerStarted","Data":"a5cde3f7b4b6a092bd2bb84c189cc7caed4e619e11812c1c5586f5e84a68aaeb"}
Apr 22 19:23:46.745262 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:46.745250 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-spzjt"
Apr 22 19:23:46.758456 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:46.758416 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rgfwb" podStartSLOduration=34.922866617 podStartE2EDuration="40.758403358s" podCreationTimestamp="2026-04-22 19:23:06 +0000 UTC" firstStartedPulling="2026-04-22 19:23:40.489925871 +0000 UTC m=+35.405980127" lastFinishedPulling="2026-04-22 19:23:46.325462613 +0000 UTC m=+41.241516868" observedRunningTime="2026-04-22 19:23:46.758117422 +0000 UTC m=+41.674171699" watchObservedRunningTime="2026-04-22 19:23:46.758403358 +0000 UTC m=+41.674457634"
Apr 22 19:23:46.774195 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:46.774128 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-spzjt" podStartSLOduration=35.948886738 podStartE2EDuration="41.774114599s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:23:40.48993003 +0000 UTC m=+35.405984284" lastFinishedPulling="2026-04-22 19:23:46.315157877 +0000 UTC m=+41.231212145" observedRunningTime="2026-04-22 19:23:46.773410216 +0000 UTC m=+41.689464484" watchObservedRunningTime="2026-04-22 19:23:46.774114599 +0000 UTC m=+41.690168877"
Apr 22 19:23:53.360281 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:53.360234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g"
Apr 22 19:23:53.360281 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:53.360285 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"
Apr 22 19:23:53.360697 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.360385 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:53.360697 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.360386 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:53.360697 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.360451 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:09.360434272 +0000 UTC m=+64.276488527 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:53.360697 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.360395 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found
Apr 22 19:23:53.360697 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.360486 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:24:09.360478552 +0000 UTC m=+64.276532807 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found
Apr 22 19:23:53.460910 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:23:53.460881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9"
Apr 22 19:23:53.461055 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.461032 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:53.461094 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:23:53.461087 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:09.461073386 +0000 UTC m=+64.377127641 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found Apr 22 19:24:03.709803 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:03.709775 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cz572" Apr 22 19:24:09.375424 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:09.375384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:24:09.375878 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:09.375430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:24:09.375878 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.375549 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:09.375878 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.375613 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:41.375597804 +0000 UTC m=+96.291652064 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found Apr 22 19:24:09.375878 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.375552 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:09.375878 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.375636 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found Apr 22 19:24:09.375878 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.375670 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:24:41.375661707 +0000 UTC m=+96.291715976 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found Apr 22 19:24:09.476692 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:09.476654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:24:09.476827 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.476807 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:09.476885 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:09.476872 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:24:41.476856184 +0000 UTC m=+96.392910439 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found Apr 22 19:24:11.389808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:11.389759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:24:11.390225 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:11.389907 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:24:11.390225 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:11.389985 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:15.389969507 +0000 UTC m=+130.306023761 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : secret "metrics-daemon-secret" not found Apr 22 19:24:17.749398 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:17.749366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-spzjt" Apr 22 19:24:41.403078 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:41.403037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:24:41.403590 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:41.403116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:24:41.403590 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.403185 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:41.403590 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.403201 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:41.403590 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.403206 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found Apr 22 19:24:41.403590 
ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.403259 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:25:45.403247101 +0000 UTC m=+160.319301355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found Apr 22 19:24:41.403590 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.403272 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:25:45.403266303 +0000 UTC m=+160.319320559 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found Apr 22 19:24:41.503958 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:24:41.503920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:24:41.504148 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.504069 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:41.504148 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:24:41.504129 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:25:45.504114301 +0000 UTC m=+160.420168556 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found Apr 22 19:25:15.447415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:15.447365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:25:15.448075 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:15.447532 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:25:15.448075 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:15.447617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs podName:a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:17.44759721 +0000 UTC m=+252.363651468 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs") pod "network-metrics-daemon-czpht" (UID: "a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4") : secret "metrics-daemon-secret" not found Apr 22 19:25:36.753846 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.753808 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8"] Apr 22 19:25:36.758386 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.758358 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.762052 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.762029 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:25:36.762165 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.762069 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.763013 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.762980 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:25:36.763079 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.762997 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.763079 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.763064 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-4pfsv\"" Apr 22 19:25:36.772408 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.772386 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8"] Apr 22 19:25:36.798080 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.798053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgksp\" (UniqueName: \"kubernetes.io/projected/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-kube-api-access-wgksp\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: 
\"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.798235 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.798122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: \"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.798235 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.798199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: \"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.899419 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.899386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: \"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.899531 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.899434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: 
\"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.899531 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.899461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgksp\" (UniqueName: \"kubernetes.io/projected/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-kube-api-access-wgksp\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: \"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.899918 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.899901 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: \"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.903166 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.903146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: \"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:36.909953 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:36.909928 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgksp\" (UniqueName: \"kubernetes.io/projected/e32d7faf-e9d5-42d2-a5b0-ac06cb089d17-kube-api-access-wgksp\") pod \"kube-storage-version-migrator-operator-6769c5d45-khgg8\" (UID: 
\"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:37.067230 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:37.067198 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" Apr 22 19:25:37.187497 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:37.187467 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8"] Apr 22 19:25:37.190732 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:25:37.190706 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32d7faf_e9d5_42d2_a5b0_ac06cb089d17.slice/crio-5195530b174b226cc453a363addc7cc63b04583c508942263078632ba8e2f67c WatchSource:0}: Error finding container 5195530b174b226cc453a363addc7cc63b04583c508942263078632ba8e2f67c: Status 404 returned error can't find the container with id 5195530b174b226cc453a363addc7cc63b04583c508942263078632ba8e2f67c Apr 22 19:25:37.949219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:37.949180 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" event={"ID":"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17","Type":"ContainerStarted","Data":"5195530b174b226cc453a363addc7cc63b04583c508942263078632ba8e2f67c"} Apr 22 19:25:39.954774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:39.954738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" 
event={"ID":"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17","Type":"ContainerStarted","Data":"55dbb3b92e6c26813a9f17df3ff3c70f9bd93aafe1bf616b4af4f58d7a6036cb"} Apr 22 19:25:39.970339 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:39.970289 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" podStartSLOduration=2.015864017 podStartE2EDuration="3.970274143s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:25:37.192429651 +0000 UTC m=+152.108483906" lastFinishedPulling="2026-04-22 19:25:39.146839777 +0000 UTC m=+154.062894032" observedRunningTime="2026-04-22 19:25:39.969689438 +0000 UTC m=+154.885743718" watchObservedRunningTime="2026-04-22 19:25:39.970274143 +0000 UTC m=+154.886328453" Apr 22 19:25:40.555886 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:40.555841 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" podUID="c23b359a-a238-4cae-b9b5-646b1b984bcf" Apr 22 19:25:40.563605 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:40.563577 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-56v9g" podUID="0b284839-b3dc-4bf0-b479-744c1da18b4b" Apr 22 19:25:40.579878 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:40.579850 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zx6g9" podUID="75f18f37-b8b6-4514-90d1-259b37372b4b" Apr 22 19:25:40.957098 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:40.957066 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-56v9g" Apr 22 19:25:41.587700 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:41.587659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-czpht" podUID="a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4" Apr 22 19:25:42.990930 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.990892 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-qzndc"] Apr 22 19:25:42.994037 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.993998 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:42.996547 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.996525 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 19:25:42.996638 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.996589 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 19:25:42.996684 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.996589 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 19:25:42.997768 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.997748 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bw56s\"" Apr 22 19:25:42.997831 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:42.997816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 19:25:43.001861 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.001835 
2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-qzndc"] Apr 22 19:25:43.047357 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.047323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-signing-key\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.047527 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.047372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-signing-cabundle\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.047527 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.047395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ckg\" (UniqueName: \"kubernetes.io/projected/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-kube-api-access-q6ckg\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.148280 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.148243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-signing-cabundle\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.148442 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.148286 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-q6ckg\" (UniqueName: \"kubernetes.io/projected/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-kube-api-access-q6ckg\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.148442 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.148381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-signing-key\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.148892 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.148870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-signing-cabundle\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.150761 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.150740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-signing-key\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.156673 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.156653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ckg\" (UniqueName: \"kubernetes.io/projected/0969b2aa-c622-4ac5-91cf-7f9144e3cc8b-kube-api-access-q6ckg\") pod \"service-ca-865cb79987-qzndc\" (UID: \"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b\") " pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.303382 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.303300 
2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-qzndc" Apr 22 19:25:43.416944 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.416910 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-qzndc"] Apr 22 19:25:43.419768 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:25:43.419739 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0969b2aa_c622_4ac5_91cf_7f9144e3cc8b.slice/crio-d2308a0f8393efa98ac53ce9518aee92627185ccad30cf8b093e6a53177b7f86 WatchSource:0}: Error finding container d2308a0f8393efa98ac53ce9518aee92627185ccad30cf8b093e6a53177b7f86: Status 404 returned error can't find the container with id d2308a0f8393efa98ac53ce9518aee92627185ccad30cf8b093e6a53177b7f86 Apr 22 19:25:43.945460 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.945428 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rkjln_ddd68968-e706-4765-85c0-cc5f617ffb19/dns-node-resolver/0.log" Apr 22 19:25:43.963553 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:43.963516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-qzndc" event={"ID":"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b","Type":"ContainerStarted","Data":"d2308a0f8393efa98ac53ce9518aee92627185ccad30cf8b093e6a53177b7f86"} Apr 22 19:25:44.744418 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:44.744386 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-47vkz_6ef064a5-78b1-49a5-a46f-8d155af983ba/node-ca/0.log" Apr 22 19:25:45.463642 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:45.463611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: 
\"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:25:45.463849 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:45.463653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") pod \"image-registry-6d4fcf77b6-9bwd5\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:25:45.463849 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.463743 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:25:45.463849 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.463758 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d4fcf77b6-9bwd5: secret "image-registry-tls" not found Apr 22 19:25:45.463849 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.463797 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls podName:c23b359a-a238-4cae-b9b5-646b1b984bcf nodeName:}" failed. No retries permitted until 2026-04-22 19:27:47.463784076 +0000 UTC m=+282.379838330 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls") pod "image-registry-6d4fcf77b6-9bwd5" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf") : secret "image-registry-tls" not found Apr 22 19:25:45.463849 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.463742 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:25:45.464054 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.463867 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls podName:0b284839-b3dc-4bf0-b479-744c1da18b4b nodeName:}" failed. No retries permitted until 2026-04-22 19:27:47.46385521 +0000 UTC m=+282.379909464 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls") pod "dns-default-56v9g" (UID: "0b284839-b3dc-4bf0-b479-744c1da18b4b") : secret "dns-default-metrics-tls" not found Apr 22 19:25:45.564158 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:45.564127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:25:45.564320 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.564268 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:25:45.564368 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:25:45.564328 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert podName:75f18f37-b8b6-4514-90d1-259b37372b4b nodeName:}" failed. 
No retries permitted until 2026-04-22 19:27:47.564311097 +0000 UTC m=+282.480365376 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert") pod "ingress-canary-zx6g9" (UID: "75f18f37-b8b6-4514-90d1-259b37372b4b") : secret "canary-serving-cert" not found Apr 22 19:25:45.972119 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:45.972082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-qzndc" event={"ID":"0969b2aa-c622-4ac5-91cf-7f9144e3cc8b","Type":"ContainerStarted","Data":"5d5fa0d298176faae09dcc8db6c0c933de708ca652afbb81d21ca32d0f82dff5"} Apr 22 19:25:45.987694 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:45.987639 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-qzndc" podStartSLOduration=2.470068001 podStartE2EDuration="3.987622617s" podCreationTimestamp="2026-04-22 19:25:42 +0000 UTC" firstStartedPulling="2026-04-22 19:25:43.421490141 +0000 UTC m=+158.337544396" lastFinishedPulling="2026-04-22 19:25:44.939044756 +0000 UTC m=+159.855099012" observedRunningTime="2026-04-22 19:25:45.987210861 +0000 UTC m=+160.903265137" watchObservedRunningTime="2026-04-22 19:25:45.987622617 +0000 UTC m=+160.903676896" Apr 22 19:25:46.546017 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:46.545959 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-khgg8_e32d7faf-e9d5-42d2-a5b0-ac06cb089d17/kube-storage-version-migrator-operator/0.log" Apr 22 19:25:51.575089 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:51.575060 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:25:53.575655 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:53.575621 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:25:57.575582 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:25:57.575505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:26:04.167147 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.167108 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8"] Apr 22 19:26:04.170642 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.170624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.174713 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.174693 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:26:04.174817 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.174696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gl6zz\"" Apr 22 19:26:04.174817 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.174698 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:26:04.185956 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.185937 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8"] Apr 22 19:26:04.249282 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.249254 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-ln9lz"] Apr 22 19:26:04.252426 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.252405 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.254965 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.254946 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:26:04.255081 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.255046 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:26:04.255235 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.255221 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:26:04.255463 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.255448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:26:04.256150 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.256132 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2cndx\"" Apr 22 19:26:04.264200 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.264181 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-99qxx"] Apr 22 19:26:04.267156 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.267142 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:04.269682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.269662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ln9lz"] Apr 22 19:26:04.270149 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.270135 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-dqlhx\"" Apr 22 19:26:04.270476 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.270462 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:26:04.270928 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.270908 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:26:04.283141 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.283118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-99qxx"] Apr 22 19:26:04.305430 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/67e410f9-b014-4163-a371-50beadc36300-data-volume\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.305604 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/67e410f9-b014-4163-a371-50beadc36300-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 
22 19:26:04.305604 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f9013b10-84e5-4801-a992-369ae0ce0e83-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mkkg8\" (UID: \"f9013b10-84e5-4801-a992-369ae0ce0e83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.305604 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xxnc\" (UniqueName: \"kubernetes.io/projected/67e410f9-b014-4163-a371-50beadc36300-kube-api-access-9xxnc\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.305604 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/67e410f9-b014-4163-a371-50beadc36300-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.305791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/67e410f9-b014-4163-a371-50beadc36300-crio-socket\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.305791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.305681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f9013b10-84e5-4801-a992-369ae0ce0e83-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mkkg8\" (UID: \"f9013b10-84e5-4801-a992-369ae0ce0e83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.406194 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/67e410f9-b014-4163-a371-50beadc36300-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406194 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f9013b10-84e5-4801-a992-369ae0ce0e83-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mkkg8\" (UID: \"f9013b10-84e5-4801-a992-369ae0ce0e83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.406423 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xxnc\" (UniqueName: \"kubernetes.io/projected/67e410f9-b014-4163-a371-50beadc36300-kube-api-access-9xxnc\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406423 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl66w\" (UniqueName: \"kubernetes.io/projected/39b0850e-a695-4553-ac06-dff279c97d24-kube-api-access-nl66w\") pod 
\"downloads-6bcc868b7-99qxx\" (UID: \"39b0850e-a695-4553-ac06-dff279c97d24\") " pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:04.406423 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/67e410f9-b014-4163-a371-50beadc36300-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406423 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/67e410f9-b014-4163-a371-50beadc36300-crio-socket\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406423 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f9013b10-84e5-4801-a992-369ae0ce0e83-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mkkg8\" (UID: \"f9013b10-84e5-4801-a992-369ae0ce0e83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.406659 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/67e410f9-b014-4163-a371-50beadc36300-data-volume\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406659 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/67e410f9-b014-4163-a371-50beadc36300-crio-socket\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406786 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/67e410f9-b014-4163-a371-50beadc36300-data-volume\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.406944 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.406928 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/67e410f9-b014-4163-a371-50beadc36300-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.407108 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.407090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f9013b10-84e5-4801-a992-369ae0ce0e83-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mkkg8\" (UID: \"f9013b10-84e5-4801-a992-369ae0ce0e83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.408537 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.408509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/67e410f9-b014-4163-a371-50beadc36300-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.408648 
ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.408632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f9013b10-84e5-4801-a992-369ae0ce0e83-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mkkg8\" (UID: \"f9013b10-84e5-4801-a992-369ae0ce0e83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.423251 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.423203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xxnc\" (UniqueName: \"kubernetes.io/projected/67e410f9-b014-4163-a371-50beadc36300-kube-api-access-9xxnc\") pod \"insights-runtime-extractor-ln9lz\" (UID: \"67e410f9-b014-4163-a371-50beadc36300\") " pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.479392 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.479369 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" Apr 22 19:26:04.507284 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.507256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl66w\" (UniqueName: \"kubernetes.io/projected/39b0850e-a695-4553-ac06-dff279c97d24-kube-api-access-nl66w\") pod \"downloads-6bcc868b7-99qxx\" (UID: \"39b0850e-a695-4553-ac06-dff279c97d24\") " pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:04.519774 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.519747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl66w\" (UniqueName: \"kubernetes.io/projected/39b0850e-a695-4553-ac06-dff279c97d24-kube-api-access-nl66w\") pod \"downloads-6bcc868b7-99qxx\" (UID: \"39b0850e-a695-4553-ac06-dff279c97d24\") " pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:04.560593 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.560567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ln9lz" Apr 22 19:26:04.575771 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.575743 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:04.609123 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.609092 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8"] Apr 22 19:26:04.705142 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.705074 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ln9lz"] Apr 22 19:26:04.708349 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:04.708317 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e410f9_b014_4163_a371_50beadc36300.slice/crio-d5b433e0078b5b9aa5dd0b6d0876b61ffe1024f7bd801d4936fe68fdfd068139 WatchSource:0}: Error finding container d5b433e0078b5b9aa5dd0b6d0876b61ffe1024f7bd801d4936fe68fdfd068139: Status 404 returned error can't find the container with id d5b433e0078b5b9aa5dd0b6d0876b61ffe1024f7bd801d4936fe68fdfd068139 Apr 22 19:26:04.711533 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:04.711506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-99qxx"] Apr 22 19:26:04.715167 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:04.715146 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b0850e_a695_4553_ac06_dff279c97d24.slice/crio-3dd39d7ffe4ceff0219937d3d40a5a47b3f1abddef6a3dbc5055eb0f2733ed4a WatchSource:0}: Error finding container 3dd39d7ffe4ceff0219937d3d40a5a47b3f1abddef6a3dbc5055eb0f2733ed4a: Status 404 returned error can't find the container with id 3dd39d7ffe4ceff0219937d3d40a5a47b3f1abddef6a3dbc5055eb0f2733ed4a Apr 22 19:26:05.006885 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:05.006796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-99qxx" 
event={"ID":"39b0850e-a695-4553-ac06-dff279c97d24","Type":"ContainerStarted","Data":"3dd39d7ffe4ceff0219937d3d40a5a47b3f1abddef6a3dbc5055eb0f2733ed4a"} Apr 22 19:26:05.008094 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:05.008067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ln9lz" event={"ID":"67e410f9-b014-4163-a371-50beadc36300","Type":"ContainerStarted","Data":"c6d039183c93fc77357265cb00190c72f62eb244587eb3b7b3acd1af0fe48832"} Apr 22 19:26:05.008224 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:05.008101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ln9lz" event={"ID":"67e410f9-b014-4163-a371-50beadc36300","Type":"ContainerStarted","Data":"d5b433e0078b5b9aa5dd0b6d0876b61ffe1024f7bd801d4936fe68fdfd068139"} Apr 22 19:26:05.009034 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:05.008998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" event={"ID":"f9013b10-84e5-4801-a992-369ae0ce0e83","Type":"ContainerStarted","Data":"15eb80761ed4dd9d1728e25ae2cc79025f5b158c40c5d4dcd6adb7811434ffde"} Apr 22 19:26:06.014682 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:06.014626 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ln9lz" event={"ID":"67e410f9-b014-4163-a371-50beadc36300","Type":"ContainerStarted","Data":"70271c8c5f6d5c67499624e92a2ac9a40569c584794495f71f88924f03f71734"} Apr 22 19:26:06.016121 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:06.016081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" event={"ID":"f9013b10-84e5-4801-a992-369ae0ce0e83","Type":"ContainerStarted","Data":"a7b6dce06595c83520418fd53d1c92346d052b90f729fe7dca0b75832d4993ee"} Apr 22 19:26:06.033200 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:06.033149 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mkkg8" podStartSLOduration=0.841723127 podStartE2EDuration="2.03313124s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:04.610740639 +0000 UTC m=+179.526794897" lastFinishedPulling="2026-04-22 19:26:05.802148542 +0000 UTC m=+180.718203010" observedRunningTime="2026-04-22 19:26:06.032051664 +0000 UTC m=+180.948105940" watchObservedRunningTime="2026-04-22 19:26:06.03313124 +0000 UTC m=+180.949185518" Apr 22 19:26:07.020837 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:07.020803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ln9lz" event={"ID":"67e410f9-b014-4163-a371-50beadc36300","Type":"ContainerStarted","Data":"7799cc88e34cc8d3f5c1ed55f315403e8894d05ed3dd1bba7cc227b99739f3c9"} Apr 22 19:26:07.039245 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:07.039188 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ln9lz" podStartSLOduration=0.91835333 podStartE2EDuration="3.039170097s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:04.763460229 +0000 UTC m=+179.679514484" lastFinishedPulling="2026-04-22 19:26:06.884276996 +0000 UTC m=+181.800331251" observedRunningTime="2026-04-22 19:26:07.038748888 +0000 UTC m=+181.954803165" watchObservedRunningTime="2026-04-22 19:26:07.039170097 +0000 UTC m=+181.955224375" Apr 22 19:26:10.411564 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.411528 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69848b4c78-7szq8"] Apr 22 19:26:10.414670 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.414648 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.418918 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.418894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:26:10.419605 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.419576 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:26:10.419979 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.419961 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-brsqm\"" Apr 22 19:26:10.420081 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.419961 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:26:10.420081 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.420064 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:26:10.420173 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.420081 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:26:10.426741 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.426718 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69848b4c78-7szq8"] Apr 22 19:26:10.557992 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.557955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-console-config\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.558166 ip-10-0-141-16 kubenswrapper[2578]: I0422 
19:26:10.558019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-oauth-serving-cert\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.558249 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.558163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-serving-cert\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.558249 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.558202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvps\" (UniqueName: \"kubernetes.io/projected/4eee5b39-e330-40e6-8763-99f0426fa62e-kube-api-access-4dvps\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.558249 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.558233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-service-ca\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.558379 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.558326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-oauth-config\") pod \"console-69848b4c78-7szq8\" 
(UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.658801 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.658761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-console-config\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.658967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.658816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-oauth-serving-cert\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.658967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.658892 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-serving-cert\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.658967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.658920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvps\" (UniqueName: \"kubernetes.io/projected/4eee5b39-e330-40e6-8763-99f0426fa62e-kube-api-access-4dvps\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.658967 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.658947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-service-ca\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.659276 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.658998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-oauth-config\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.659561 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.659530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-oauth-serving-cert\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.659561 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.659548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-console-config\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.659727 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.659710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-service-ca\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.661725 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.661670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-oauth-config\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.661725 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.661694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-serving-cert\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.668498 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.668469 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvps\" (UniqueName: \"kubernetes.io/projected/4eee5b39-e330-40e6-8763-99f0426fa62e-kube-api-access-4dvps\") pod \"console-69848b4c78-7szq8\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.725472 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.725440 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:10.864493 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.864462 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69848b4c78-7szq8"] Apr 22 19:26:10.870213 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.870096 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52"] Apr 22 19:26:10.874665 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.874609 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:10.877330 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.877231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2jzg9\"" Apr 22 19:26:10.877429 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.877392 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 19:26:10.883601 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.883567 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52"] Apr 22 19:26:10.962365 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:10.962294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7dd66eca-bc70-412c-a906-24c51d1c9fa2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5cd52\" (UID: \"7dd66eca-bc70-412c-a906-24c51d1c9fa2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:11.031688 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:11.031650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69848b4c78-7szq8" event={"ID":"4eee5b39-e330-40e6-8763-99f0426fa62e","Type":"ContainerStarted","Data":"d8e699250a1d395b89923b460eeb4a443f9432af583c3e556f92c7a274616662"} Apr 22 19:26:11.063153 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:11.063121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7dd66eca-bc70-412c-a906-24c51d1c9fa2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5cd52\" (UID: 
\"7dd66eca-bc70-412c-a906-24c51d1c9fa2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:11.063300 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:11.063248 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 22 19:26:11.063362 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:11.063315 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dd66eca-bc70-412c-a906-24c51d1c9fa2-tls-certificates podName:7dd66eca-bc70-412c-a906-24c51d1c9fa2 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:11.563295691 +0000 UTC m=+186.479349949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/7dd66eca-bc70-412c-a906-24c51d1c9fa2-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-5cd52" (UID: "7dd66eca-bc70-412c-a906-24c51d1c9fa2") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 19:26:11.568147 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:11.568109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7dd66eca-bc70-412c-a906-24c51d1c9fa2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5cd52\" (UID: \"7dd66eca-bc70-412c-a906-24c51d1c9fa2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:11.570908 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:11.570884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/7dd66eca-bc70-412c-a906-24c51d1c9fa2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5cd52\" (UID: \"7dd66eca-bc70-412c-a906-24c51d1c9fa2\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:11.786672 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:11.786622 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:11.929882 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:11.929848 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52"] Apr 22 19:26:11.933783 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:11.933752 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dd66eca_bc70_412c_a906_24c51d1c9fa2.slice/crio-310fe920cbc8813b4efb4fd56a3da6e6ab0906a247e8bb3766c3491279c209d7 WatchSource:0}: Error finding container 310fe920cbc8813b4efb4fd56a3da6e6ab0906a247e8bb3766c3491279c209d7: Status 404 returned error can't find the container with id 310fe920cbc8813b4efb4fd56a3da6e6ab0906a247e8bb3766c3491279c209d7 Apr 22 19:26:12.036261 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:12.036223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" event={"ID":"7dd66eca-bc70-412c-a906-24c51d1c9fa2","Type":"ContainerStarted","Data":"310fe920cbc8813b4efb4fd56a3da6e6ab0906a247e8bb3766c3491279c209d7"} Apr 22 19:26:21.061658 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.061613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69848b4c78-7szq8" event={"ID":"4eee5b39-e330-40e6-8763-99f0426fa62e","Type":"ContainerStarted","Data":"d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4"} Apr 22 19:26:21.063317 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.063280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" event={"ID":"7dd66eca-bc70-412c-a906-24c51d1c9fa2","Type":"ContainerStarted","Data":"8b1263a2ede719d76be71277e68d9a7f69628bcbe4d46d59a65127dd01964236"} Apr 22 19:26:21.063507 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.063480 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:21.064863 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.064819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-99qxx" event={"ID":"39b0850e-a695-4553-ac06-dff279c97d24","Type":"ContainerStarted","Data":"7af63393e29b098f8a083ceb4e512c9e2563f370220bd3b59c13f0c9d7454433"} Apr 22 19:26:21.065091 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.065038 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:21.068773 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.068752 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" Apr 22 19:26:21.079344 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.079276 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69848b4c78-7szq8" podStartSLOduration=1.2306937709999999 podStartE2EDuration="11.079262939s" podCreationTimestamp="2026-04-22 19:26:10 +0000 UTC" firstStartedPulling="2026-04-22 19:26:10.87179564 +0000 UTC m=+185.787849900" lastFinishedPulling="2026-04-22 19:26:20.72036481 +0000 UTC m=+195.636419068" observedRunningTime="2026-04-22 19:26:21.078086075 +0000 UTC m=+195.994140352" watchObservedRunningTime="2026-04-22 19:26:21.079262939 +0000 UTC m=+195.995317216" Apr 22 19:26:21.082324 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.082301 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-99qxx" Apr 22 19:26:21.092882 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.092713 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5cd52" podStartSLOduration=2.308398806 podStartE2EDuration="11.092695223s" podCreationTimestamp="2026-04-22 19:26:10 +0000 UTC" firstStartedPulling="2026-04-22 19:26:11.936221317 +0000 UTC m=+186.852275578" lastFinishedPulling="2026-04-22 19:26:20.72051773 +0000 UTC m=+195.636571995" observedRunningTime="2026-04-22 19:26:21.09226407 +0000 UTC m=+196.008318350" watchObservedRunningTime="2026-04-22 19:26:21.092695223 +0000 UTC m=+196.008749500" Apr 22 19:26:21.116178 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.116121 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-99qxx" podStartSLOduration=1.062771814 podStartE2EDuration="17.116103013s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:04.716830208 +0000 UTC m=+179.632884464" lastFinishedPulling="2026-04-22 19:26:20.770161405 +0000 UTC m=+195.686215663" observedRunningTime="2026-04-22 19:26:21.113577754 +0000 UTC m=+196.029632033" watchObservedRunningTime="2026-04-22 19:26:21.116103013 +0000 UTC m=+196.032157291" Apr 22 19:26:21.981291 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.981256 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jdcj6"] Apr 22 19:26:21.985880 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.985855 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:21.988601 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.988566 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:26:21.989817 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.989770 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:26:21.989817 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.989788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:26:21.990029 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.989851 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:26:21.990415 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.990396 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:26:21.990498 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.990471 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6ctbv\"" Apr 22 19:26:21.996328 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:21.996307 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jdcj6"] Apr 22 19:26:22.160312 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.160277 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2df0037e-915e-45bb-b309-13ac91a9d566-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.160789 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.160370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlpmc\" (UniqueName: \"kubernetes.io/projected/2df0037e-915e-45bb-b309-13ac91a9d566-kube-api-access-dlpmc\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.160789 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.160410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2df0037e-915e-45bb-b309-13ac91a9d566-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.160789 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.160492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2df0037e-915e-45bb-b309-13ac91a9d566-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.261661 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.261573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2df0037e-915e-45bb-b309-13ac91a9d566-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.261831 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:26:22.261680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlpmc\" (UniqueName: \"kubernetes.io/projected/2df0037e-915e-45bb-b309-13ac91a9d566-kube-api-access-dlpmc\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.261831 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.261726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2df0037e-915e-45bb-b309-13ac91a9d566-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.261946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.261868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2df0037e-915e-45bb-b309-13ac91a9d566-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.262434 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.262406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2df0037e-915e-45bb-b309-13ac91a9d566-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.264503 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.264475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/2df0037e-915e-45bb-b309-13ac91a9d566-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.264620 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.264569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2df0037e-915e-45bb-b309-13ac91a9d566-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.270823 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.270796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlpmc\" (UniqueName: \"kubernetes.io/projected/2df0037e-915e-45bb-b309-13ac91a9d566-kube-api-access-dlpmc\") pod \"prometheus-operator-5676c8c784-jdcj6\" (UID: \"2df0037e-915e-45bb-b309-13ac91a9d566\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.298256 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.298216 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" Apr 22 19:26:22.434740 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:22.434705 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jdcj6"] Apr 22 19:26:22.438607 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:22.438578 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df0037e_915e_45bb_b309_13ac91a9d566.slice/crio-b6951aeb91630fad5d16a26125242993f9940605517c4e286580fcf59d2af4a2 WatchSource:0}: Error finding container b6951aeb91630fad5d16a26125242993f9940605517c4e286580fcf59d2af4a2: Status 404 returned error can't find the container with id b6951aeb91630fad5d16a26125242993f9940605517c4e286580fcf59d2af4a2 Apr 22 19:26:23.071763 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:23.071725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" event={"ID":"2df0037e-915e-45bb-b309-13ac91a9d566","Type":"ContainerStarted","Data":"b6951aeb91630fad5d16a26125242993f9940605517c4e286580fcf59d2af4a2"} Apr 22 19:26:25.079120 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:25.079084 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" event={"ID":"2df0037e-915e-45bb-b309-13ac91a9d566","Type":"ContainerStarted","Data":"a699f75869b04d046b7a21e61cb29618e8ea8463f5b75b32f962ce7acefe063b"} Apr 22 19:26:25.079120 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:25.079120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" event={"ID":"2df0037e-915e-45bb-b309-13ac91a9d566","Type":"ContainerStarted","Data":"9318da486aad177c8802044d374fba68694c23cfbf677cd5e5e05a31679059e5"} Apr 22 19:26:25.101709 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:25.101654 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdcj6" podStartSLOduration=2.233480226 podStartE2EDuration="4.101636962s" podCreationTimestamp="2026-04-22 19:26:21 +0000 UTC" firstStartedPulling="2026-04-22 19:26:22.441165197 +0000 UTC m=+197.357219464" lastFinishedPulling="2026-04-22 19:26:24.309321935 +0000 UTC m=+199.225376200" observedRunningTime="2026-04-22 19:26:25.100811057 +0000 UTC m=+200.016865358" watchObservedRunningTime="2026-04-22 19:26:25.101636962 +0000 UTC m=+200.017691238" Apr 22 19:26:26.392795 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:26.392760 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"] Apr 22 19:26:26.393315 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:26.392962 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" podUID="c23b359a-a238-4cae-b9b5-646b1b984bcf" Apr 22 19:26:27.084887 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.084852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:26:27.089997 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.089975 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:26:27.105051 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105023 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-certificates\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105188 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105067 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c23b359a-a238-4cae-b9b5-646b1b984bcf-ca-trust-extracted\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105188 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105102 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx89j\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-kube-api-access-sx89j\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105188 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105129 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-image-registry-private-configuration\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105188 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105160 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-bound-sa-token\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: 
\"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105403 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105196 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-installation-pull-secrets\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105403 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105247 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-trusted-ca\") pod \"c23b359a-a238-4cae-b9b5-646b1b984bcf\" (UID: \"c23b359a-a238-4cae-b9b5-646b1b984bcf\") " Apr 22 19:26:27.105512 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105419 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:27.105628 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105606 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23b359a-a238-4cae-b9b5-646b1b984bcf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:26:27.105826 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.105798 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:27.107892 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.107862 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:27.108047 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.107927 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:27.108047 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.107927 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-kube-api-access-sx89j" (OuterVolumeSpecName: "kube-api-access-sx89j") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "kube-api-access-sx89j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:27.108165 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.108096 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c23b359a-a238-4cae-b9b5-646b1b984bcf" (UID: "c23b359a-a238-4cae-b9b5-646b1b984bcf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:27.205844 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205809 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c23b359a-a238-4cae-b9b5-646b1b984bcf-ca-trust-extracted\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.205844 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205844 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sx89j\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-kube-api-access-sx89j\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.206026 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205860 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-image-registry-private-configuration\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.206026 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205878 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-bound-sa-token\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.206026 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205893 2578 reconciler_common.go:299] "Volume detached for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c23b359a-a238-4cae-b9b5-646b1b984bcf-installation-pull-secrets\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.206026 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205907 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-trusted-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.206026 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.205919 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-certificates\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:27.367780 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.367702 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jlfs8"] Apr 22 19:26:27.426107 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.426073 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.434973 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.434945 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:26:27.435101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.434993 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:26:27.435101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.435075 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-v96mz\"" Apr 22 19:26:27.435314 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.435266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:26:27.509757 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.509724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-wtmp\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.509913 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.509771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-textfile\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.509913 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.509839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-sys\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.510047 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.509898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.510047 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.510018 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af74d4e2-417f-4944-b66e-31be143b888b-metrics-client-ca\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.510163 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.510107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf454\" (UniqueName: \"kubernetes.io/projected/af74d4e2-417f-4944-b66e-31be143b888b-kube-api-access-gf454\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.510219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.510167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-root\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.510279 ip-10-0-141-16 kubenswrapper[2578]: 
I0422 19:26:27.510213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.510332 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.510275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.611657 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af74d4e2-417f-4944-b66e-31be143b888b-metrics-client-ca\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.611810 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf454\" (UniqueName: \"kubernetes.io/projected/af74d4e2-417f-4944-b66e-31be143b888b-kube-api-access-gf454\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.611810 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-root\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " 
pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.611810 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.611810 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612048 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-wtmp\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612048 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-textfile\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612048 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-sys\") pod 
\"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612048 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.611910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612303 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.612281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af74d4e2-417f-4944-b66e-31be143b888b-metrics-client-ca\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612398 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.612369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-sys\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612457 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:27.612405 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:26:27.612506 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:27.612466 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls podName:af74d4e2-417f-4944-b66e-31be143b888b nodeName:}" failed. No retries permitted until 2026-04-22 19:26:28.112446816 +0000 UTC m=+203.028501075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls") pod "node-exporter-jlfs8" (UID: "af74d4e2-417f-4944-b66e-31be143b888b") : secret "node-exporter-tls" not found Apr 22 19:26:27.612562 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.612515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-wtmp\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.612560 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/af74d4e2-417f-4944-b66e-31be143b888b-root\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.612591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-textfile\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.612744 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.612719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.614518 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.614492 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:27.623969 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:27.623905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf454\" (UniqueName: \"kubernetes.io/projected/af74d4e2-417f-4944-b66e-31be143b888b-kube-api-access-gf454\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:28.087853 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.087822 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d4fcf77b6-9bwd5" Apr 22 19:26:28.116385 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.116351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8" Apr 22 19:26:28.116540 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:28.116490 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:26:28.116594 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:28.116550 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls podName:af74d4e2-417f-4944-b66e-31be143b888b nodeName:}" failed. No retries permitted until 2026-04-22 19:26:29.11653267 +0000 UTC m=+204.032586943 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls") pod "node-exporter-jlfs8" (UID: "af74d4e2-417f-4944-b66e-31be143b888b") : secret "node-exporter-tls" not found Apr 22 19:26:28.125279 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.125255 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"] Apr 22 19:26:28.139482 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.139459 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d4fcf77b6-9bwd5"] Apr 22 19:26:28.217380 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.217350 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c23b359a-a238-4cae-b9b5-646b1b984bcf-registry-tls\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:28.530584 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.530505 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:26:28.566610 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.566576 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:26:28.566772 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.566756 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.569576 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.569544 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:26:28.569701 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.569602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 19:26:28.569701 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.569620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 19:26:28.569853 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.569828 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 19:26:28.569958 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.569939 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 19:26:28.570123 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.569975 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 19:26:28.570123 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.570059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 19:26:28.570123 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.570077 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:26:28.570123 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.570109 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mgbmz\"" Apr 22 19:26:28.570361 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.570061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:26:28.621316 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-config-volume\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621456 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1af2773d-c45e-41c5-84c1-8141c91e1e38-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621456 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1af2773d-c45e-41c5-84c1-8141c91e1e38-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621456 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621414 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-web-config\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621628 
ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1af2773d-c45e-41c5-84c1-8141c91e1e38-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621628 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621628 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621747 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621747 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621679 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1af2773d-c45e-41c5-84c1-8141c91e1e38-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621747 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1af2773d-c45e-41c5-84c1-8141c91e1e38-config-out\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621747 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621969 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.621969 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.621792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjtl\" (UniqueName: \"kubernetes.io/projected/1af2773d-c45e-41c5-84c1-8141c91e1e38-kube-api-access-4cjtl\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722298 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:26:28.722266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-config-volume\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722298 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722305 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1af2773d-c45e-41c5-84c1-8141c91e1e38-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722523 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1af2773d-c45e-41c5-84c1-8141c91e1e38-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722523 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-web-config\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722523 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1af2773d-c45e-41c5-84c1-8141c91e1e38-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722523 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:26:28.722449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722523 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722523 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722828 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1af2773d-c45e-41c5-84c1-8141c91e1e38-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722828 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1af2773d-c45e-41c5-84c1-8141c91e1e38-config-out\") pod 
\"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722828 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722828 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722828 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjtl\" (UniqueName: \"kubernetes.io/projected/1af2773d-c45e-41c5-84c1-8141c91e1e38-kube-api-access-4cjtl\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.722828 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.722724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1af2773d-c45e-41c5-84c1-8141c91e1e38-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.723172 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:28.723064 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 
19:26:28.723172 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:28.723133 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-main-tls podName:1af2773d-c45e-41c5-84c1-8141c91e1e38 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:29.223115349 +0000 UTC m=+204.139169611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1af2773d-c45e-41c5-84c1-8141c91e1e38") : secret "alertmanager-main-tls" not found Apr 22 19:26:28.724923 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.723747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1af2773d-c45e-41c5-84c1-8141c91e1e38-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.724923 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.724641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1af2773d-c45e-41c5-84c1-8141c91e1e38-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.725386 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.725314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-web-config\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:28.725724 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.725624 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1af2773d-c45e-41c5-84c1-8141c91e1e38-config-out\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.725724 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.725637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-config-volume\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.725918 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.725894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.726350 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.726308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.727489 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.727466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1af2773d-c45e-41c5-84c1-8141c91e1e38-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.727840 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.727822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.727991 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.727969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:28.731971 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:28.731951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cjtl\" (UniqueName: \"kubernetes.io/projected/1af2773d-c45e-41c5-84c1-8141c91e1e38-kube-api-access-4cjtl\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:29.125910 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.125879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8"
Apr 22 19:26:29.128510 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.128487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af74d4e2-417f-4944-b66e-31be143b888b-node-exporter-tls\") pod \"node-exporter-jlfs8\" (UID: \"af74d4e2-417f-4944-b66e-31be143b888b\") " pod="openshift-monitoring/node-exporter-jlfs8"
Apr 22 19:26:29.226946 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.226907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:29.229546 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.229521 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1af2773d-c45e-41c5-84c1-8141c91e1e38-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1af2773d-c45e-41c5-84c1-8141c91e1e38\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:29.237391 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.237365 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jlfs8"
Apr 22 19:26:29.246284 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:29.246252 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf74d4e2_417f_4944_b66e_31be143b888b.slice/crio-3e826ab3d9b8fd11d66b16f2acaca5fc58f319d62b01f54065b362aad0bdaa33 WatchSource:0}: Error finding container 3e826ab3d9b8fd11d66b16f2acaca5fc58f319d62b01f54065b362aad0bdaa33: Status 404 returned error can't find the container with id 3e826ab3d9b8fd11d66b16f2acaca5fc58f319d62b01f54065b362aad0bdaa33
Apr 22 19:26:29.473961 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.473884 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69848b4c78-7szq8"]
Apr 22 19:26:29.478837 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.478810 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:29.581090 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.580877 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23b359a-a238-4cae-b9b5-646b1b984bcf" path="/var/lib/kubelet/pods/c23b359a-a238-4cae-b9b5-646b1b984bcf/volumes"
Apr 22 19:26:29.651495 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:29.651460 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:26:29.655467 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:29.655440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1af2773d_c45e_41c5_84c1_8141c91e1e38.slice/crio-df667a5a126c4518c171b165f79cfc92c3ea7afc73b913ccd86948f544dad5f5 WatchSource:0}: Error finding container df667a5a126c4518c171b165f79cfc92c3ea7afc73b913ccd86948f544dad5f5: Status 404 returned error can't find the container with id df667a5a126c4518c171b165f79cfc92c3ea7afc73b913ccd86948f544dad5f5
Apr 22 19:26:30.094492 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:30.094447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlfs8" event={"ID":"af74d4e2-417f-4944-b66e-31be143b888b","Type":"ContainerStarted","Data":"3e826ab3d9b8fd11d66b16f2acaca5fc58f319d62b01f54065b362aad0bdaa33"}
Apr 22 19:26:30.095625 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:30.095584 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"df667a5a126c4518c171b165f79cfc92c3ea7afc73b913ccd86948f544dad5f5"}
Apr 22 19:26:30.727617 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:30.725816 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69848b4c78-7szq8"
Apr 22 19:26:31.099352 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.099317 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlfs8" event={"ID":"af74d4e2-417f-4944-b66e-31be143b888b","Type":"ContainerStarted","Data":"2962898cc432ad446a01e38e152fb8c197f497b0214e54be6b67c625442edc45"}
Apr 22 19:26:31.746913 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.746879 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-bcf47d79f-8569z"]
Apr 22 19:26:31.755221 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.755158 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.757876 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.757858 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 19:26:31.758974 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.758953 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m9qns\""
Apr 22 19:26:31.759083 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.759049 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 19:26:31.759083 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.759060 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 19:26:31.759186 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.759092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a6vc73vgc4r8u\""
Apr 22 19:26:31.759186 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.759052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 19:26:31.761657 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.761640 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bcf47d79f-8569z"]
Apr 22 19:26:31.852163 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vk9\" (UniqueName: \"kubernetes.io/projected/dd14d123-c500-4b7e-9c97-f1b3b16e8570-kube-api-access-q2vk9\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.852276 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-secret-metrics-server-tls\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.852276 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dd14d123-c500-4b7e-9c97-f1b3b16e8570-audit-log\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.852276 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-secret-metrics-server-client-certs\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.852394 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd14d123-c500-4b7e-9c97-f1b3b16e8570-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.852394 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-client-ca-bundle\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.852394 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.852376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dd14d123-c500-4b7e-9c97-f1b3b16e8570-metrics-server-audit-profiles\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953425 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dd14d123-c500-4b7e-9c97-f1b3b16e8570-metrics-server-audit-profiles\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953592 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vk9\" (UniqueName: \"kubernetes.io/projected/dd14d123-c500-4b7e-9c97-f1b3b16e8570-kube-api-access-q2vk9\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953592 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-secret-metrics-server-tls\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953592 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dd14d123-c500-4b7e-9c97-f1b3b16e8570-audit-log\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953592 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-secret-metrics-server-client-certs\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953796 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd14d123-c500-4b7e-9c97-f1b3b16e8570-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953796 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-client-ca-bundle\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.953913 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.953888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dd14d123-c500-4b7e-9c97-f1b3b16e8570-audit-log\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.954409 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.954384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd14d123-c500-4b7e-9c97-f1b3b16e8570-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.954511 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.954429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dd14d123-c500-4b7e-9c97-f1b3b16e8570-metrics-server-audit-profiles\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.955951 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.955927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-secret-metrics-server-client-certs\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.956363 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.956346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-secret-metrics-server-tls\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.956441 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.956425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd14d123-c500-4b7e-9c97-f1b3b16e8570-client-ca-bundle\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:31.961687 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:31.961666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vk9\" (UniqueName: \"kubernetes.io/projected/dd14d123-c500-4b7e-9c97-f1b3b16e8570-kube-api-access-q2vk9\") pod \"metrics-server-bcf47d79f-8569z\" (UID: \"dd14d123-c500-4b7e-9c97-f1b3b16e8570\") " pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:32.065144 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:32.065117 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:32.105094 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:32.105059 2578 generic.go:358] "Generic (PLEG): container finished" podID="af74d4e2-417f-4944-b66e-31be143b888b" containerID="2962898cc432ad446a01e38e152fb8c197f497b0214e54be6b67c625442edc45" exitCode=0
Apr 22 19:26:32.105320 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:32.105187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlfs8" event={"ID":"af74d4e2-417f-4944-b66e-31be143b888b","Type":"ContainerDied","Data":"2962898cc432ad446a01e38e152fb8c197f497b0214e54be6b67c625442edc45"}
Apr 22 19:26:32.107329 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:32.107143 2578 generic.go:358] "Generic (PLEG): container finished" podID="1af2773d-c45e-41c5-84c1-8141c91e1e38" containerID="71d320bf1ad503cd6183316b0b5f3f45fb057bb26b47047ef5b1a85e9af37ff2" exitCode=0
Apr 22 19:26:32.107329 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:32.107182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerDied","Data":"71d320bf1ad503cd6183316b0b5f3f45fb057bb26b47047ef5b1a85e9af37ff2"}
Apr 22 19:26:32.212359 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:32.212335 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bcf47d79f-8569z"]
Apr 22 19:26:32.215199 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:32.215173 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd14d123_c500_4b7e_9c97_f1b3b16e8570.slice/crio-e449be576eba809decaf476bbb7454b1a81d2f88e592b6efa2c3845355b1b58a WatchSource:0}: Error finding container e449be576eba809decaf476bbb7454b1a81d2f88e592b6efa2c3845355b1b58a: Status 404 returned error can't find the container with id e449be576eba809decaf476bbb7454b1a81d2f88e592b6efa2c3845355b1b58a
Apr 22 19:26:33.114492 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.114452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlfs8" event={"ID":"af74d4e2-417f-4944-b66e-31be143b888b","Type":"ContainerStarted","Data":"f3b4ba175a427cb4e157746a132ef0e346ce7a27b854da3e7410421c3724912e"}
Apr 22 19:26:33.114492 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.114497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jlfs8" event={"ID":"af74d4e2-417f-4944-b66e-31be143b888b","Type":"ContainerStarted","Data":"2222ee15b2a91726656ff39e9ddd6b127363c8d02f325d359dd5907d71ed9c17"}
Apr 22 19:26:33.115647 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.115602 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z" event={"ID":"dd14d123-c500-4b7e-9c97-f1b3b16e8570","Type":"ContainerStarted","Data":"e449be576eba809decaf476bbb7454b1a81d2f88e592b6efa2c3845355b1b58a"}
Apr 22 19:26:33.205527 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.205482 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jlfs8" podStartSLOduration=4.938561976 podStartE2EDuration="6.205466631s" podCreationTimestamp="2026-04-22 19:26:27 +0000 UTC" firstStartedPulling="2026-04-22 19:26:29.248193798 +0000 UTC m=+204.164248055" lastFinishedPulling="2026-04-22 19:26:30.515098441 +0000 UTC m=+205.431152710" observedRunningTime="2026-04-22 19:26:33.137499835 +0000 UTC m=+208.053554122" watchObservedRunningTime="2026-04-22 19:26:33.205466631 +0000 UTC m=+208.121520908"
Apr 22 19:26:33.206510 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.206482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bf98dd59-9htqm"]
Apr 22 19:26:33.237956 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.237933 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bf98dd59-9htqm"]
Apr 22 19:26:33.238110 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.238080 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.245574 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.245541 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 19:26:33.264688 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.264661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-oauth-serving-cert\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.264818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.264706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-config\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.264818 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.264781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-serving-cert\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.265084 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.264926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7wh\" (UniqueName: \"kubernetes.io/projected/453ec5ff-1fb8-46a0-8cbd-c044041e2169-kube-api-access-bq7wh\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.265084 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.264963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-trusted-ca-bundle\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.265084 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.265024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-oauth-config\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.265084 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.265051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-service-ca\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366643 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-service-ca\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366798 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-oauth-serving-cert\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366798 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-config\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366798 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-serving-cert\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366952 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7wh\" (UniqueName: \"kubernetes.io/projected/453ec5ff-1fb8-46a0-8cbd-c044041e2169-kube-api-access-bq7wh\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366952 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-trusted-ca-bundle\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.366952 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.366871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-oauth-config\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.367442 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.367410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-service-ca\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.367970 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.367923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-config\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.368936 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.368894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-trusted-ca-bundle\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.369589 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.369538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-oauth-serving-cert\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.372525 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.372482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-serving-cert\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.372615 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.372525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-oauth-config\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.376016 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.375981 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7wh\" (UniqueName: \"kubernetes.io/projected/453ec5ff-1fb8-46a0-8cbd-c044041e2169-kube-api-access-bq7wh\") pod \"console-5bf98dd59-9htqm\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:33.549614 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:33.549547 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:34.569678 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:34.569633 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bf98dd59-9htqm"]
Apr 22 19:26:34.573781 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:26:34.573755 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453ec5ff_1fb8_46a0_8cbd_c044041e2169.slice/crio-4e2d57322ec656bf4112d97871f5394e081027fb43388c7dfd541f042c838b12 WatchSource:0}: Error finding container 4e2d57322ec656bf4112d97871f5394e081027fb43388c7dfd541f042c838b12: Status 404 returned error can't find the container with id 4e2d57322ec656bf4112d97871f5394e081027fb43388c7dfd541f042c838b12
Apr 22 19:26:35.123056 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.123023 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"7a4a946c39c67b67255732396b7558c600b15b00ed00a08632a63cd19ceae195"}
Apr 22 19:26:35.123226 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.123063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"a9a14f3523e7cf14fcc0a4b1f6cfd2636b6d32f87529e1ec004c0f3acfe13552"}
Apr 22 19:26:35.123226 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.123076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"64bf3e8651e66366bbfe697720ea0aaf038c466a9f7087b173eda4c48a4e2d7d"}
Apr 22 19:26:35.123226 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.123086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"3e6e0299fe4701316e791a92b0504ba4f5fd57cb4feb6374a58cf9789cfb0cca"}
Apr 22 19:26:35.123226 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.123095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"273aa24a53ea8a097ef7f05e0247ae8db7c85f20a62e1449d18fb20c4b106b95"}
Apr 22 19:26:35.124298 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.124268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z" event={"ID":"dd14d123-c500-4b7e-9c97-f1b3b16e8570","Type":"ContainerStarted","Data":"cf5533373aff160f79a46b7b666a89429c93c73139567bf7b46605dd35a067c5"}
Apr 22 19:26:35.125594 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.125576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf98dd59-9htqm" event={"ID":"453ec5ff-1fb8-46a0-8cbd-c044041e2169","Type":"ContainerStarted","Data":"75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910"}
Apr 22 19:26:35.125675 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.125597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf98dd59-9htqm" event={"ID":"453ec5ff-1fb8-46a0-8cbd-c044041e2169","Type":"ContainerStarted","Data":"4e2d57322ec656bf4112d97871f5394e081027fb43388c7dfd541f042c838b12"}
Apr 22 19:26:35.144271 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.144222 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z" podStartSLOduration=1.908941032 podStartE2EDuration="4.144211791s" podCreationTimestamp="2026-04-22 19:26:31 +0000 UTC" firstStartedPulling="2026-04-22 19:26:32.217484871 +0000 UTC m=+207.133539130" lastFinishedPulling="2026-04-22 19:26:34.45275562 +0000 UTC m=+209.368809889" observedRunningTime="2026-04-22 19:26:35.143299488 +0000 UTC m=+210.059353771" watchObservedRunningTime="2026-04-22 19:26:35.144211791 +0000 UTC m=+210.060266068"
Apr 22 19:26:35.166791 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:35.166756 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bf98dd59-9htqm" podStartSLOduration=2.166745857 podStartE2EDuration="2.166745857s" podCreationTimestamp="2026-04-22 19:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:26:35.165588312 +0000 UTC m=+210.081642590" watchObservedRunningTime="2026-04-22 19:26:35.166745857 +0000 UTC m=+210.082800135"
Apr 22 19:26:37.133628 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:37.133592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1af2773d-c45e-41c5-84c1-8141c91e1e38","Type":"ContainerStarted","Data":"c45b4f3156bddd965eca4d917968236a6acc2b30f97953e9289eaf737a88352f"}
Apr 22 19:26:37.161115 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:37.161062 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.690011448 podStartE2EDuration="9.161046705s" podCreationTimestamp="2026-04-22 19:26:28 +0000 UTC" firstStartedPulling="2026-04-22 19:26:29.657644041 +0000 UTC m=+204.573698303" lastFinishedPulling="2026-04-22 19:26:36.128679304 +0000 UTC m=+211.044733560" observedRunningTime="2026-04-22 19:26:37.159815229 +0000 UTC m=+212.075869506" watchObservedRunningTime="2026-04-22 19:26:37.161046705 +0000 UTC m=+212.077100983"
Apr 22 19:26:43.550694 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:43.550654 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:43.550694 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:43.550699 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:43.555254 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:43.555231 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:44.155721 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:44.155692 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bf98dd59-9htqm"
Apr 22 19:26:52.066137 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:52.066101 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:52.066137 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:52.066148 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z"
Apr 22 19:26:54.495579 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.495514 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69848b4c78-7szq8" podUID="4eee5b39-e330-40e6-8763-99f0426fa62e" containerName="console" containerID="cri-o://d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4" gracePeriod=15
Apr 22 19:26:54.751292 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.751242 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69848b4c78-7szq8_4eee5b39-e330-40e6-8763-99f0426fa62e/console/0.log"
Apr 22 19:26:54.751388 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.751300 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:54.837962 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.837936 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-console-config\") pod \"4eee5b39-e330-40e6-8763-99f0426fa62e\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " Apr 22 19:26:54.838101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.837975 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-oauth-config\") pod \"4eee5b39-e330-40e6-8763-99f0426fa62e\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " Apr 22 19:26:54.838101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838032 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-service-ca\") pod \"4eee5b39-e330-40e6-8763-99f0426fa62e\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " Apr 22 19:26:54.838101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838077 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-oauth-serving-cert\") pod \"4eee5b39-e330-40e6-8763-99f0426fa62e\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " Apr 22 19:26:54.838101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838095 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-serving-cert\") pod \"4eee5b39-e330-40e6-8763-99f0426fa62e\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " Apr 22 19:26:54.838308 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:26:54.838149 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dvps\" (UniqueName: \"kubernetes.io/projected/4eee5b39-e330-40e6-8763-99f0426fa62e-kube-api-access-4dvps\") pod \"4eee5b39-e330-40e6-8763-99f0426fa62e\" (UID: \"4eee5b39-e330-40e6-8763-99f0426fa62e\") " Apr 22 19:26:54.838364 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838334 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-console-config" (OuterVolumeSpecName: "console-config") pod "4eee5b39-e330-40e6-8763-99f0426fa62e" (UID: "4eee5b39-e330-40e6-8763-99f0426fa62e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:54.838443 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838422 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-console-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:54.838500 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838444 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4eee5b39-e330-40e6-8763-99f0426fa62e" (UID: "4eee5b39-e330-40e6-8763-99f0426fa62e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:54.838500 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.838452 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-service-ca" (OuterVolumeSpecName: "service-ca") pod "4eee5b39-e330-40e6-8763-99f0426fa62e" (UID: "4eee5b39-e330-40e6-8763-99f0426fa62e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:54.840150 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.840127 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4eee5b39-e330-40e6-8763-99f0426fa62e" (UID: "4eee5b39-e330-40e6-8763-99f0426fa62e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:54.840218 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.840167 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4eee5b39-e330-40e6-8763-99f0426fa62e" (UID: "4eee5b39-e330-40e6-8763-99f0426fa62e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:54.840258 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.840220 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eee5b39-e330-40e6-8763-99f0426fa62e-kube-api-access-4dvps" (OuterVolumeSpecName: "kube-api-access-4dvps") pod "4eee5b39-e330-40e6-8763-99f0426fa62e" (UID: "4eee5b39-e330-40e6-8763-99f0426fa62e"). InnerVolumeSpecName "kube-api-access-4dvps". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:54.939449 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.939427 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-oauth-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:54.939449 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.939448 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:54.939571 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.939457 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dvps\" (UniqueName: \"kubernetes.io/projected/4eee5b39-e330-40e6-8763-99f0426fa62e-kube-api-access-4dvps\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:54.939571 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.939467 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4eee5b39-e330-40e6-8763-99f0426fa62e-console-oauth-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:54.939571 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:54.939476 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4eee5b39-e330-40e6-8763-99f0426fa62e-service-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:26:55.181377 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.181358 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69848b4c78-7szq8_4eee5b39-e330-40e6-8763-99f0426fa62e/console/0.log" Apr 22 19:26:55.181472 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.181397 2578 generic.go:358] "Generic (PLEG): 
container finished" podID="4eee5b39-e330-40e6-8763-99f0426fa62e" containerID="d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4" exitCode=2 Apr 22 19:26:55.181472 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.181469 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69848b4c78-7szq8" Apr 22 19:26:55.181550 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.181483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69848b4c78-7szq8" event={"ID":"4eee5b39-e330-40e6-8763-99f0426fa62e","Type":"ContainerDied","Data":"d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4"} Apr 22 19:26:55.181598 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.181546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69848b4c78-7szq8" event={"ID":"4eee5b39-e330-40e6-8763-99f0426fa62e","Type":"ContainerDied","Data":"d8e699250a1d395b89923b460eeb4a443f9432af583c3e556f92c7a274616662"} Apr 22 19:26:55.181598 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.181566 2578 scope.go:117] "RemoveContainer" containerID="d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4" Apr 22 19:26:55.189119 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.189101 2578 scope.go:117] "RemoveContainer" containerID="d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4" Apr 22 19:26:55.189396 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:26:55.189375 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4\": container with ID starting with d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4 not found: ID does not exist" containerID="d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4" Apr 22 19:26:55.189457 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.189406 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4"} err="failed to get container status \"d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4\": rpc error: code = NotFound desc = could not find container \"d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4\": container with ID starting with d899fe8b31796d50dc77c6b44eb071049e7dfbd269ee212a56650c52f84bc2a4 not found: ID does not exist" Apr 22 19:26:55.202611 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.202590 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69848b4c78-7szq8"] Apr 22 19:26:55.207539 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.207521 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69848b4c78-7szq8"] Apr 22 19:26:55.580602 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:55.580571 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eee5b39-e330-40e6-8763-99f0426fa62e" path="/var/lib/kubelet/pods/4eee5b39-e330-40e6-8763-99f0426fa62e/volumes" Apr 22 19:26:56.185652 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:56.185619 2578 generic.go:358] "Generic (PLEG): container finished" podID="e32d7faf-e9d5-42d2-a5b0-ac06cb089d17" containerID="55dbb3b92e6c26813a9f17df3ff3c70f9bd93aafe1bf616b4af4f58d7a6036cb" exitCode=0 Apr 22 19:26:56.185815 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:56.185694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" event={"ID":"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17","Type":"ContainerDied","Data":"55dbb3b92e6c26813a9f17df3ff3c70f9bd93aafe1bf616b4af4f58d7a6036cb"} Apr 22 19:26:56.186014 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:56.185982 2578 scope.go:117] "RemoveContainer" 
containerID="55dbb3b92e6c26813a9f17df3ff3c70f9bd93aafe1bf616b4af4f58d7a6036cb" Apr 22 19:26:57.190593 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:26:57.190559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khgg8" event={"ID":"e32d7faf-e9d5-42d2-a5b0-ac06cb089d17","Type":"ContainerStarted","Data":"5eac2ee5c9b328d497728fee39f9bd0a347be505ec1f4e31663368ba5fd45fb4"} Apr 22 19:27:01.372706 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:01.372674 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rkjln_ddd68968-e706-4765-85c0-cc5f617ffb19/dns-node-resolver/0.log" Apr 22 19:27:12.071857 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:12.071829 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z" Apr 22 19:27:12.076625 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:12.076600 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-bcf47d79f-8569z" Apr 22 19:27:17.514657 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:17.514617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:27:17.516867 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:17.516845 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4-metrics-certs\") pod \"network-metrics-daemon-czpht\" (UID: \"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4\") " pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:27:17.678946 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:27:17.678919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ff584\"" Apr 22 19:27:17.686320 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:17.686297 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-czpht" Apr 22 19:27:17.806494 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:17.806401 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-czpht"] Apr 22 19:27:17.809110 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:27:17.809082 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79ea9d4_e3c7_4e4e_80eb_47a7ca3f62a4.slice/crio-9b0fd0cc1d0c39adc283607c3ba161ea44d4e2fb1a6f379ba188008a8b90e333 WatchSource:0}: Error finding container 9b0fd0cc1d0c39adc283607c3ba161ea44d4e2fb1a6f379ba188008a8b90e333: Status 404 returned error can't find the container with id 9b0fd0cc1d0c39adc283607c3ba161ea44d4e2fb1a6f379ba188008a8b90e333 Apr 22 19:27:18.249232 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:18.249188 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-czpht" event={"ID":"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4","Type":"ContainerStarted","Data":"9b0fd0cc1d0c39adc283607c3ba161ea44d4e2fb1a6f379ba188008a8b90e333"} Apr 22 19:27:19.253573 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:19.253531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-czpht" event={"ID":"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4","Type":"ContainerStarted","Data":"94b4c74641e4f7271a9304a718b9380e553ad5e036334ea31dd1d9aac3d0e705"} Apr 22 19:27:19.253573 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:19.253585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-czpht" event={"ID":"a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4","Type":"ContainerStarted","Data":"54cfca3b43aacc0f811670ab52d8cd3da490ef5a01af9f6eb2971b3eae2c483e"} Apr 22 19:27:19.273729 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:19.273672 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-czpht" podStartSLOduration=253.394699156 podStartE2EDuration="4m14.273653815s" podCreationTimestamp="2026-04-22 19:23:05 +0000 UTC" firstStartedPulling="2026-04-22 19:27:17.811529155 +0000 UTC m=+252.727583417" lastFinishedPulling="2026-04-22 19:27:18.690483821 +0000 UTC m=+253.606538076" observedRunningTime="2026-04-22 19:27:19.271444234 +0000 UTC m=+254.187498522" watchObservedRunningTime="2026-04-22 19:27:19.273653815 +0000 UTC m=+254.189708092" Apr 22 19:27:43.957865 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:27:43.957826 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-56v9g" podUID="0b284839-b3dc-4bf0-b479-744c1da18b4b" Apr 22 19:27:44.322421 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:44.322392 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-56v9g" Apr 22 19:27:47.531167 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.531134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:27:47.533323 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.533301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b284839-b3dc-4bf0-b479-744c1da18b4b-metrics-tls\") pod \"dns-default-56v9g\" (UID: \"0b284839-b3dc-4bf0-b479-744c1da18b4b\") " pod="openshift-dns/dns-default-56v9g" Apr 22 19:27:47.625629 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.625604 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6f5sr\"" Apr 22 19:27:47.632026 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.631993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:27:47.633409 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.633392 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-56v9g" Apr 22 19:27:47.634217 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.634163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75f18f37-b8b6-4514-90d1-259b37372b4b-cert\") pod \"ingress-canary-zx6g9\" (UID: \"75f18f37-b8b6-4514-90d1-259b37372b4b\") " pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:27:47.749744 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.749721 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-56v9g"] Apr 22 19:27:47.751674 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:27:47.751635 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b284839_b3dc_4bf0_b479_744c1da18b4b.slice/crio-11b607f10660fde966806bee0e05cb663f169b2bea71c0d32336483355596cec WatchSource:0}: Error finding container 11b607f10660fde966806bee0e05cb663f169b2bea71c0d32336483355596cec: Status 404 returned error can't find the container with id 11b607f10660fde966806bee0e05cb663f169b2bea71c0d32336483355596cec Apr 22 19:27:47.879111 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.879086 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrkhq\"" Apr 22 19:27:47.887163 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:47.887144 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zx6g9" Apr 22 19:27:48.000785 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:48.000763 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zx6g9"] Apr 22 19:27:48.003101 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:27:48.003066 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f18f37_b8b6_4514_90d1_259b37372b4b.slice/crio-6f11153314750c0c7acfd4bd106bc3cb0dbcb3f688d9ae4b71e3d0c90e49d6a2 WatchSource:0}: Error finding container 6f11153314750c0c7acfd4bd106bc3cb0dbcb3f688d9ae4b71e3d0c90e49d6a2: Status 404 returned error can't find the container with id 6f11153314750c0c7acfd4bd106bc3cb0dbcb3f688d9ae4b71e3d0c90e49d6a2 Apr 22 19:27:48.334500 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:48.334461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zx6g9" event={"ID":"75f18f37-b8b6-4514-90d1-259b37372b4b","Type":"ContainerStarted","Data":"6f11153314750c0c7acfd4bd106bc3cb0dbcb3f688d9ae4b71e3d0c90e49d6a2"} Apr 22 19:27:48.335697 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:48.335671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-56v9g" event={"ID":"0b284839-b3dc-4bf0-b479-744c1da18b4b","Type":"ContainerStarted","Data":"11b607f10660fde966806bee0e05cb663f169b2bea71c0d32336483355596cec"} Apr 22 19:27:49.342210 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:49.342152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-56v9g" event={"ID":"0b284839-b3dc-4bf0-b479-744c1da18b4b","Type":"ContainerStarted","Data":"230bcfa50f2ed28655a81248f167e48db58474f7de25fba341f5ad6fd371aa7f"} Apr 22 19:27:50.346329 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:50.346297 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-zx6g9" event={"ID":"75f18f37-b8b6-4514-90d1-259b37372b4b","Type":"ContainerStarted","Data":"b8576db6a64804a763e1d495b2cb4bb1939eed36f6639ee6bde5a33e195fa5a6"} Apr 22 19:27:50.347872 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:50.347843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-56v9g" event={"ID":"0b284839-b3dc-4bf0-b479-744c1da18b4b","Type":"ContainerStarted","Data":"dc22f19abe14d61b556fe0953df90d25dae2ebac7297bcc9e3409d190697c9b1"} Apr 22 19:27:50.347981 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:50.347948 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-56v9g" Apr 22 19:27:50.363217 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:50.363175 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zx6g9" podStartSLOduration=251.381417146 podStartE2EDuration="4m13.363163448s" podCreationTimestamp="2026-04-22 19:23:37 +0000 UTC" firstStartedPulling="2026-04-22 19:27:48.005048198 +0000 UTC m=+282.921102454" lastFinishedPulling="2026-04-22 19:27:49.986794497 +0000 UTC m=+284.902848756" observedRunningTime="2026-04-22 19:27:50.362150034 +0000 UTC m=+285.278204312" watchObservedRunningTime="2026-04-22 19:27:50.363163448 +0000 UTC m=+285.279217704" Apr 22 19:27:50.381729 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:50.381684 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-56v9g" podStartSLOduration=251.989992217 podStartE2EDuration="4m13.381669172s" podCreationTimestamp="2026-04-22 19:23:37 +0000 UTC" firstStartedPulling="2026-04-22 19:27:47.753432749 +0000 UTC m=+282.669487007" lastFinishedPulling="2026-04-22 19:27:49.145109703 +0000 UTC m=+284.061163962" observedRunningTime="2026-04-22 19:27:50.380932439 +0000 UTC m=+285.296986727" watchObservedRunningTime="2026-04-22 19:27:50.381669172 +0000 UTC 
m=+285.297723450" Apr 22 19:27:51.711551 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.710513 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx"] Apr 22 19:27:51.711551 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.710916 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eee5b39-e330-40e6-8763-99f0426fa62e" containerName="console" Apr 22 19:27:51.711551 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.710932 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eee5b39-e330-40e6-8763-99f0426fa62e" containerName="console" Apr 22 19:27:51.711551 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.711019 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4eee5b39-e330-40e6-8763-99f0426fa62e" containerName="console" Apr 22 19:27:51.714299 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.714279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.717097 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.717068 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-cftpq\"" Apr 22 19:27:51.717964 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.717297 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 19:27:51.717964 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.717496 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 19:27:51.717964 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.717568 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 19:27:51.717964 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:27:51.717830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 19:27:51.718236 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.718093 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 19:27:51.725633 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.725613 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 19:27:51.729750 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.729729 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx"] Apr 22 19:27:51.760808 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.760772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-serving-certs-ca-bundle\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.760956 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.760834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-secret-telemeter-client\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.760956 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.760874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-federate-client-tls\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.760956 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.760900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.761160 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.760967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-metrics-client-ca\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.761160 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.761085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-telemeter-client-tls\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.761160 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.761121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56pm\" (UniqueName: \"kubernetes.io/projected/92852648-bc0a-4f01-a363-49185a9c21e8-kube-api-access-b56pm\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: 
\"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.761283 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.761169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862083 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-federate-client-tls\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862083 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862315 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-metrics-client-ca\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862315 ip-10-0-141-16 kubenswrapper[2578]: 
I0422 19:27:51.862152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-telemeter-client-tls\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862315 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862168 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b56pm\" (UniqueName: \"kubernetes.io/projected/92852648-bc0a-4f01-a363-49185a9c21e8-kube-api-access-b56pm\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862315 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862315 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-serving-certs-ca-bundle\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.862315 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-secret-telemeter-client\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.863079 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.862971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-metrics-client-ca\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.863079 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.863046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.863231 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.863099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92852648-bc0a-4f01-a363-49185a9c21e8-serving-certs-ca-bundle\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.864596 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.864575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-federate-client-tls\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.864697 
ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.864636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.864994 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.864977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-telemeter-client-tls\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.865063 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.864990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/92852648-bc0a-4f01-a363-49185a9c21e8-secret-telemeter-client\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:51.871041 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:51.870995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56pm\" (UniqueName: \"kubernetes.io/projected/92852648-bc0a-4f01-a363-49185a9c21e8-kube-api-access-b56pm\") pod \"telemeter-client-6f758ff8cd-vl6jx\" (UID: \"92852648-bc0a-4f01-a363-49185a9c21e8\") " pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:52.026333 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:52.026279 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" Apr 22 19:27:52.152206 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:52.152176 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx"] Apr 22 19:27:52.154047 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:27:52.154019 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92852648_bc0a_4f01_a363_49185a9c21e8.slice/crio-ec6249181c79db37a099a083e55025791d9f73eeaf3963e9fdaa3a67166604e0 WatchSource:0}: Error finding container ec6249181c79db37a099a083e55025791d9f73eeaf3963e9fdaa3a67166604e0: Status 404 returned error can't find the container with id ec6249181c79db37a099a083e55025791d9f73eeaf3963e9fdaa3a67166604e0 Apr 22 19:27:52.354550 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:52.354517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" event={"ID":"92852648-bc0a-4f01-a363-49185a9c21e8","Type":"ContainerStarted","Data":"ec6249181c79db37a099a083e55025791d9f73eeaf3963e9fdaa3a67166604e0"} Apr 22 19:27:54.361617 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:54.361576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" event={"ID":"92852648-bc0a-4f01-a363-49185a9c21e8","Type":"ContainerStarted","Data":"defc03ce2b7de51efa115634a8ee2f160ccf3e7147201bd37a9f6c78d88c7b9a"} Apr 22 19:27:54.361617 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:54.361619 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" event={"ID":"92852648-bc0a-4f01-a363-49185a9c21e8","Type":"ContainerStarted","Data":"e45d47cc941d87af350cec01d0f1bbca80aaaa907e658cf6414389d966e05ba9"} Apr 22 19:27:54.361988 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:54.361628 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" event={"ID":"92852648-bc0a-4f01-a363-49185a9c21e8","Type":"ContainerStarted","Data":"c8cd118dfcc268ed6b0244f496ea24662db8f3311e1d1b5872c67cc1afd3f807"} Apr 22 19:27:54.385434 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:27:54.385377 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6f758ff8cd-vl6jx" podStartSLOduration=1.709602861 podStartE2EDuration="3.385359041s" podCreationTimestamp="2026-04-22 19:27:51 +0000 UTC" firstStartedPulling="2026-04-22 19:27:52.155953723 +0000 UTC m=+287.072007978" lastFinishedPulling="2026-04-22 19:27:53.831709896 +0000 UTC m=+288.747764158" observedRunningTime="2026-04-22 19:27:54.384107645 +0000 UTC m=+289.300161935" watchObservedRunningTime="2026-04-22 19:27:54.385359041 +0000 UTC m=+289.301413319" Apr 22 19:28:00.352228 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:00.352199 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-56v9g" Apr 22 19:28:05.505624 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:05.505599 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:28:05.763203 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:05.763136 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bf98dd59-9htqm"] Apr 22 19:28:30.783537 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:30.783481 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5bf98dd59-9htqm" podUID="453ec5ff-1fb8-46a0-8cbd-c044041e2169" containerName="console" containerID="cri-o://75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910" gracePeriod=15 Apr 22 19:28:31.010525 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.010503 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5bf98dd59-9htqm_453ec5ff-1fb8-46a0-8cbd-c044041e2169/console/0.log" Apr 22 19:28:31.010652 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.010575 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf98dd59-9htqm" Apr 22 19:28:31.157476 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157451 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-oauth-serving-cert\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157661 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157490 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-config\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157661 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157519 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq7wh\" (UniqueName: \"kubernetes.io/projected/453ec5ff-1fb8-46a0-8cbd-c044041e2169-kube-api-access-bq7wh\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157661 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157572 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-trusted-ca-bundle\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157661 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157607 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-oauth-config\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157859 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157679 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-serving-cert\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157859 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157707 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-service-ca\") pod \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\" (UID: \"453ec5ff-1fb8-46a0-8cbd-c044041e2169\") " Apr 22 19:28:31.157968 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157877 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:31.157968 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.157917 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-config" (OuterVolumeSpecName: "console-config") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:31.158255 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.158231 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-service-ca" (OuterVolumeSpecName: "service-ca") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:31.158432 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.158405 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:31.159759 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.159729 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453ec5ff-1fb8-46a0-8cbd-c044041e2169-kube-api-access-bq7wh" (OuterVolumeSpecName: "kube-api-access-bq7wh") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "kube-api-access-bq7wh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:31.159854 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.159758 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:31.159979 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.159951 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "453ec5ff-1fb8-46a0-8cbd-c044041e2169" (UID: "453ec5ff-1fb8-46a0-8cbd-c044041e2169"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:31.258359 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258332 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.258359 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258355 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-service-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.258486 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258365 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-oauth-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.258486 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258374 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.258486 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258384 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq7wh\" (UniqueName: 
\"kubernetes.io/projected/453ec5ff-1fb8-46a0-8cbd-c044041e2169-kube-api-access-bq7wh\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.258486 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258391 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453ec5ff-1fb8-46a0-8cbd-c044041e2169-trusted-ca-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.258486 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.258401 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/453ec5ff-1fb8-46a0-8cbd-c044041e2169-console-oauth-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:28:31.467767 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.467697 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bf98dd59-9htqm_453ec5ff-1fb8-46a0-8cbd-c044041e2169/console/0.log" Apr 22 19:28:31.467767 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.467733 2578 generic.go:358] "Generic (PLEG): container finished" podID="453ec5ff-1fb8-46a0-8cbd-c044041e2169" containerID="75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910" exitCode=2 Apr 22 19:28:31.467929 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.467762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf98dd59-9htqm" event={"ID":"453ec5ff-1fb8-46a0-8cbd-c044041e2169","Type":"ContainerDied","Data":"75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910"} Apr 22 19:28:31.467929 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.467798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf98dd59-9htqm" event={"ID":"453ec5ff-1fb8-46a0-8cbd-c044041e2169","Type":"ContainerDied","Data":"4e2d57322ec656bf4112d97871f5394e081027fb43388c7dfd541f042c838b12"} Apr 22 19:28:31.467929 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:28:31.467813 2578 scope.go:117] "RemoveContainer" containerID="75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910" Apr 22 19:28:31.467929 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.467810 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf98dd59-9htqm" Apr 22 19:28:31.476160 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.476140 2578 scope.go:117] "RemoveContainer" containerID="75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910" Apr 22 19:28:31.476405 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:28:31.476387 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910\": container with ID starting with 75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910 not found: ID does not exist" containerID="75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910" Apr 22 19:28:31.476451 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.476413 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910"} err="failed to get container status \"75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910\": rpc error: code = NotFound desc = could not find container \"75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910\": container with ID starting with 75220d0f4d94eb233892caf0a235691d42766b55f6439edc31c5dc4b2342a910 not found: ID does not exist" Apr 22 19:28:31.489698 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.489677 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bf98dd59-9htqm"] Apr 22 19:28:31.494015 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.493970 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-5bf98dd59-9htqm"] Apr 22 19:28:31.579161 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:28:31.579136 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453ec5ff-1fb8-46a0-8cbd-c044041e2169" path="/var/lib/kubelet/pods/453ec5ff-1fb8-46a0-8cbd-c044041e2169/volumes" Apr 22 19:33:00.629203 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.629169 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bfb8d57b4-c2rfn"] Apr 22 19:33:00.629683 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.629438 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="453ec5ff-1fb8-46a0-8cbd-c044041e2169" containerName="console" Apr 22 19:33:00.629683 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.629449 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="453ec5ff-1fb8-46a0-8cbd-c044041e2169" containerName="console" Apr 22 19:33:00.629683 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.629511 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="453ec5ff-1fb8-46a0-8cbd-c044041e2169" containerName="console" Apr 22 19:33:00.632290 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.632274 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.640683 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.640660 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:33:00.640824 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.640762 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:33:00.640824 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.640777 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-brsqm\"" Apr 22 19:33:00.640824 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.640771 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:33:00.640824 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.640812 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:33:00.641038 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.640888 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:33:00.655364 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.655347 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:33:00.666112 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.666093 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bfb8d57b4-c2rfn"] Apr 22 19:33:00.752854 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.752824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-oauth-serving-cert\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.752992 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.752865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-trusted-ca-bundle\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.752992 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.752914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-oauth-config\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.752992 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.752945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-config\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.752992 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.752975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg6f\" (UniqueName: \"kubernetes.io/projected/6ead9e63-437a-4907-9f07-ddb2b92f185e-kube-api-access-xfg6f\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.753147 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:33:00.752995 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-serving-cert\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.753147 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.753094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-service-ca\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854145 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-trusted-ca-bundle\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854145 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-oauth-config\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854346 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-config\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") 
" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854346 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg6f\" (UniqueName: \"kubernetes.io/projected/6ead9e63-437a-4907-9f07-ddb2b92f185e-kube-api-access-xfg6f\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854346 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-serving-cert\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854346 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-service-ca\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.854522 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-oauth-serving-cert\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.855041 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.854996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-service-ca\") pod \"console-5bfb8d57b4-c2rfn\" (UID: 
\"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.855041 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.855028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-trusted-ca-bundle\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.855221 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.855052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-config\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.855221 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.855128 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ead9e63-437a-4907-9f07-ddb2b92f185e-oauth-serving-cert\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.856695 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.856673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-oauth-config\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.856790 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.856686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ead9e63-437a-4907-9f07-ddb2b92f185e-console-serving-cert\") pod 
\"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.863979 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.863961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg6f\" (UniqueName: \"kubernetes.io/projected/6ead9e63-437a-4907-9f07-ddb2b92f185e-kube-api-access-xfg6f\") pod \"console-5bfb8d57b4-c2rfn\" (UID: \"6ead9e63-437a-4907-9f07-ddb2b92f185e\") " pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:00.940282 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:00.940199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:01.056935 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:01.056897 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bfb8d57b4-c2rfn"] Apr 22 19:33:01.059662 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:33:01.059628 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ead9e63_437a_4907_9f07_ddb2b92f185e.slice/crio-e7d7eda028caf5c1e4edc93b9438c7c5f52f1f8eaf2cba6c9505cfafc0146ad7 WatchSource:0}: Error finding container e7d7eda028caf5c1e4edc93b9438c7c5f52f1f8eaf2cba6c9505cfafc0146ad7: Status 404 returned error can't find the container with id e7d7eda028caf5c1e4edc93b9438c7c5f52f1f8eaf2cba6c9505cfafc0146ad7 Apr 22 19:33:01.061449 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:01.061434 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:33:01.181388 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:01.181351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bfb8d57b4-c2rfn" 
event={"ID":"6ead9e63-437a-4907-9f07-ddb2b92f185e","Type":"ContainerStarted","Data":"f7ec638677b0434c3d355f39957289f7ba52a82d658452c1fe21a0e9427f77aa"} Apr 22 19:33:01.181543 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:01.181394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bfb8d57b4-c2rfn" event={"ID":"6ead9e63-437a-4907-9f07-ddb2b92f185e","Type":"ContainerStarted","Data":"e7d7eda028caf5c1e4edc93b9438c7c5f52f1f8eaf2cba6c9505cfafc0146ad7"} Apr 22 19:33:01.205725 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:01.205631 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bfb8d57b4-c2rfn" podStartSLOduration=1.205615806 podStartE2EDuration="1.205615806s" podCreationTimestamp="2026-04-22 19:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:01.204990646 +0000 UTC m=+596.121044924" watchObservedRunningTime="2026-04-22 19:33:01.205615806 +0000 UTC m=+596.121670082" Apr 22 19:33:10.941395 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:10.941300 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:10.941395 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:10.941356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:10.946188 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:10.946163 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:11.215796 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:11.215716 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bfb8d57b4-c2rfn" Apr 22 19:33:21.512127 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.512091 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t"] Apr 22 19:33:21.515277 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.515254 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.518582 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.518557 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:33:21.519752 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.519736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:33:21.519752 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.519746 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:33:21.519898 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.519792 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 19:33:21.519898 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.519793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-bt9vt\"" Apr 22 19:33:21.529743 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.529724 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t"] Apr 22 19:33:21.624386 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.624355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f5f4w\" (UniqueName: \"kubernetes.io/projected/f086447e-765f-4c2f-9a47-2e87763d1f8c-kube-api-access-f5f4w\") pod \"managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t\" (UID: \"f086447e-765f-4c2f-9a47-2e87763d1f8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.624544 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.624430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f086447e-765f-4c2f-9a47-2e87763d1f8c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t\" (UID: \"f086447e-765f-4c2f-9a47-2e87763d1f8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.659328 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.659300 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m"] Apr 22 19:33:21.662613 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.662598 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.665211 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.665189 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:33:21.665502 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.665476 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:33:21.665626 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.665610 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:33:21.665672 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.665619 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:33:21.674599 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.674577 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m"] Apr 22 19:33:21.724945 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.724919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f086447e-765f-4c2f-9a47-2e87763d1f8c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t\" (UID: \"f086447e-765f-4c2f-9a47-2e87763d1f8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.725104 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.724967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5f4w\" (UniqueName: 
\"kubernetes.io/projected/f086447e-765f-4c2f-9a47-2e87763d1f8c-kube-api-access-f5f4w\") pod \"managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t\" (UID: \"f086447e-765f-4c2f-9a47-2e87763d1f8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.727147 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.727122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f086447e-765f-4c2f-9a47-2e87763d1f8c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t\" (UID: \"f086447e-765f-4c2f-9a47-2e87763d1f8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.733629 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.733606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5f4w\" (UniqueName: \"kubernetes.io/projected/f086447e-765f-4c2f-9a47-2e87763d1f8c-kube-api-access-f5f4w\") pod \"managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t\" (UID: \"f086447e-765f-4c2f-9a47-2e87763d1f8c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.825983 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.825956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.826130 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.825991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-hub\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.826130 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.826112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.826205 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.826160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/64910766-99da-4608-868b-3d40fe56d62a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.826205 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.826189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-ca\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.826272 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.826231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knss8\" (UniqueName: \"kubernetes.io/projected/64910766-99da-4608-868b-3d40fe56d62a-kube-api-access-knss8\") pod 
\"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.841912 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.841888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" Apr 22 19:33:21.927462 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.927428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.927619 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.927501 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/64910766-99da-4608-868b-3d40fe56d62a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.927619 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.927546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-ca\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.928160 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.928132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knss8\" (UniqueName: 
\"kubernetes.io/projected/64910766-99da-4608-868b-3d40fe56d62a-kube-api-access-knss8\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.928252 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.928211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.928252 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.928242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-hub\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.928861 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.928795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/64910766-99da-4608-868b-3d40fe56d62a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.930969 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.930945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-ca\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.931160 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.931138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.931260 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.931240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.931537 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.931522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/64910766-99da-4608-868b-3d40fe56d62a-hub\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.935698 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.935676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knss8\" (UniqueName: \"kubernetes.io/projected/64910766-99da-4608-868b-3d40fe56d62a-kube-api-access-knss8\") pod \"cluster-proxy-proxy-agent-566468b78d-lxt7m\" (UID: \"64910766-99da-4608-868b-3d40fe56d62a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:21.960654 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.960627 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t"] Apr 22 19:33:21.963448 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:33:21.963425 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf086447e_765f_4c2f_9a47_2e87763d1f8c.slice/crio-63582fa1893a96b924e14cf493c9f6dd568b897aa25dc0e2af9d4338613428a0 WatchSource:0}: Error finding container 63582fa1893a96b924e14cf493c9f6dd568b897aa25dc0e2af9d4338613428a0: Status 404 returned error can't find the container with id 63582fa1893a96b924e14cf493c9f6dd568b897aa25dc0e2af9d4338613428a0 Apr 22 19:33:21.971108 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:21.971086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" Apr 22 19:33:22.088323 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:22.088301 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m"] Apr 22 19:33:22.090419 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:33:22.090389 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64910766_99da_4608_868b_3d40fe56d62a.slice/crio-a12f9d8f93359a1ef1f77c9f26549e19a1a6816a84a8006a68f33918be06a31d WatchSource:0}: Error finding container a12f9d8f93359a1ef1f77c9f26549e19a1a6816a84a8006a68f33918be06a31d: Status 404 returned error can't find the container with id a12f9d8f93359a1ef1f77c9f26549e19a1a6816a84a8006a68f33918be06a31d Apr 22 19:33:22.249278 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:22.249241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" 
event={"ID":"f086447e-765f-4c2f-9a47-2e87763d1f8c","Type":"ContainerStarted","Data":"63582fa1893a96b924e14cf493c9f6dd568b897aa25dc0e2af9d4338613428a0"} Apr 22 19:33:22.250269 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:22.250248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" event={"ID":"64910766-99da-4608-868b-3d40fe56d62a","Type":"ContainerStarted","Data":"a12f9d8f93359a1ef1f77c9f26549e19a1a6816a84a8006a68f33918be06a31d"} Apr 22 19:33:26.264272 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:26.264236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" event={"ID":"f086447e-765f-4c2f-9a47-2e87763d1f8c","Type":"ContainerStarted","Data":"a4a6999f3114c4f014fec595736abcc7929c5881ec6829f417364b9f3e4cb5bd"} Apr 22 19:33:26.265602 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:26.265579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" event={"ID":"64910766-99da-4608-868b-3d40fe56d62a","Type":"ContainerStarted","Data":"ac4110845632e9d149e600cbe3e837f7e22ed5b9a573a92d75f6c5636977bbf9"} Apr 22 19:33:26.279672 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:26.279624 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b7c8f99d9-k5s6t" podStartSLOduration=1.830398556 podStartE2EDuration="5.279610937s" podCreationTimestamp="2026-04-22 19:33:21 +0000 UTC" firstStartedPulling="2026-04-22 19:33:21.965210568 +0000 UTC m=+616.881264823" lastFinishedPulling="2026-04-22 19:33:25.414422936 +0000 UTC m=+620.330477204" observedRunningTime="2026-04-22 19:33:26.278323036 +0000 UTC m=+621.194377314" watchObservedRunningTime="2026-04-22 19:33:26.279610937 +0000 UTC m=+621.195665215" Apr 22 19:33:28.273111 
ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:28.273018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" event={"ID":"64910766-99da-4608-868b-3d40fe56d62a","Type":"ContainerStarted","Data":"d17237b559cea765aec99adb29c74c462a00121cb53bdb30fe6f2a1d57f3f2ab"} Apr 22 19:33:28.273111 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:28.273058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" event={"ID":"64910766-99da-4608-868b-3d40fe56d62a","Type":"ContainerStarted","Data":"dafa15cc2923a325e8feed88a5995acefc783807ea11644aace756ec12ad6b1e"} Apr 22 19:33:28.292436 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:33:28.292393 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-566468b78d-lxt7m" podStartSLOduration=1.531804709 podStartE2EDuration="7.292379102s" podCreationTimestamp="2026-04-22 19:33:21 +0000 UTC" firstStartedPulling="2026-04-22 19:33:22.092156332 +0000 UTC m=+617.008210587" lastFinishedPulling="2026-04-22 19:33:27.852730725 +0000 UTC m=+622.768784980" observedRunningTime="2026-04-22 19:33:28.289951949 +0000 UTC m=+623.206006226" watchObservedRunningTime="2026-04-22 19:33:28.292379102 +0000 UTC m=+623.208433379" Apr 22 19:36:07.944457 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.944420 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-vcpmv"] Apr 22 19:36:07.947588 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.947569 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:07.950307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.950283 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:36:07.950476 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.950441 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 19:36:07.951469 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.951446 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:36:07.951583 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.951446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-f5zsd\"" Apr 22 19:36:07.958659 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.958637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vcpmv"] Apr 22 19:36:07.990101 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.990079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcc6m\" (UniqueName: \"kubernetes.io/projected/5207707f-ec1b-4a01-b3f2-003f514ea403-kube-api-access-xcc6m\") pod \"odh-model-controller-696fc77849-vcpmv\" (UID: \"5207707f-ec1b-4a01-b3f2-003f514ea403\") " pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:07.990199 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:07.990116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5207707f-ec1b-4a01-b3f2-003f514ea403-cert\") pod \"odh-model-controller-696fc77849-vcpmv\" (UID: \"5207707f-ec1b-4a01-b3f2-003f514ea403\") " pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:08.091374 ip-10-0-141-16 
kubenswrapper[2578]: I0422 19:36:08.091338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcc6m\" (UniqueName: \"kubernetes.io/projected/5207707f-ec1b-4a01-b3f2-003f514ea403-kube-api-access-xcc6m\") pod \"odh-model-controller-696fc77849-vcpmv\" (UID: \"5207707f-ec1b-4a01-b3f2-003f514ea403\") " pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:08.091528 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:08.091388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5207707f-ec1b-4a01-b3f2-003f514ea403-cert\") pod \"odh-model-controller-696fc77849-vcpmv\" (UID: \"5207707f-ec1b-4a01-b3f2-003f514ea403\") " pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:08.093708 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:08.093680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5207707f-ec1b-4a01-b3f2-003f514ea403-cert\") pod \"odh-model-controller-696fc77849-vcpmv\" (UID: \"5207707f-ec1b-4a01-b3f2-003f514ea403\") " pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:08.098792 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:08.098768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcc6m\" (UniqueName: \"kubernetes.io/projected/5207707f-ec1b-4a01-b3f2-003f514ea403-kube-api-access-xcc6m\") pod \"odh-model-controller-696fc77849-vcpmv\" (UID: \"5207707f-ec1b-4a01-b3f2-003f514ea403\") " pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:08.259143 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:08.259062 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:08.386129 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:36:08.386091 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5207707f_ec1b_4a01_b3f2_003f514ea403.slice/crio-17f42d80e6adeaaed719d010b04302e7053c34b8a83f098dc4a17a22f7a25c2e WatchSource:0}: Error finding container 17f42d80e6adeaaed719d010b04302e7053c34b8a83f098dc4a17a22f7a25c2e: Status 404 returned error can't find the container with id 17f42d80e6adeaaed719d010b04302e7053c34b8a83f098dc4a17a22f7a25c2e Apr 22 19:36:08.388314 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:08.388292 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vcpmv"] Apr 22 19:36:08.718895 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:08.718854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vcpmv" event={"ID":"5207707f-ec1b-4a01-b3f2-003f514ea403","Type":"ContainerStarted","Data":"17f42d80e6adeaaed719d010b04302e7053c34b8a83f098dc4a17a22f7a25c2e"} Apr 22 19:36:11.729762 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:11.729664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vcpmv" event={"ID":"5207707f-ec1b-4a01-b3f2-003f514ea403","Type":"ContainerStarted","Data":"e9a41b6daca6cd60fd054a4f12f4c4a59e808487d7f11b009f562e32e62e3b61"} Apr 22 19:36:11.730228 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:11.729808 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:11.770174 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:11.770120 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-vcpmv" podStartSLOduration=1.796073252 podStartE2EDuration="4.770104933s" 
podCreationTimestamp="2026-04-22 19:36:07 +0000 UTC" firstStartedPulling="2026-04-22 19:36:08.387614233 +0000 UTC m=+783.303668490" lastFinishedPulling="2026-04-22 19:36:11.361645916 +0000 UTC m=+786.277700171" observedRunningTime="2026-04-22 19:36:11.766146786 +0000 UTC m=+786.682201055" watchObservedRunningTime="2026-04-22 19:36:11.770104933 +0000 UTC m=+786.686159207" Apr 22 19:36:22.734745 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:22.734712 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-vcpmv" Apr 22 19:36:34.240533 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.240462 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck"] Apr 22 19:36:34.246060 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.246042 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.248589 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.248569 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c5d95\"" Apr 22 19:36:34.248689 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.248572 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 19:36:34.251404 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.251382 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck"] Apr 22 19:36:34.433058 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.433021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75lnr\" (UniqueName: \"kubernetes.io/projected/26530f77-4902-4426-a5cb-1a2e5a233017-kube-api-access-75lnr\") pod \"seaweedfs-tls-custom-ddd4dbfd-gcwck\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " 
pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.433217 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.433113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/26530f77-4902-4426-a5cb-1a2e5a233017-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-gcwck\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.533518 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.533444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75lnr\" (UniqueName: \"kubernetes.io/projected/26530f77-4902-4426-a5cb-1a2e5a233017-kube-api-access-75lnr\") pod \"seaweedfs-tls-custom-ddd4dbfd-gcwck\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.533630 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.533550 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/26530f77-4902-4426-a5cb-1a2e5a233017-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-gcwck\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.533950 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.533931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/26530f77-4902-4426-a5cb-1a2e5a233017-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-gcwck\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.541826 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.541798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75lnr\" (UniqueName: \"kubernetes.io/projected/26530f77-4902-4426-a5cb-1a2e5a233017-kube-api-access-75lnr\") pod 
\"seaweedfs-tls-custom-ddd4dbfd-gcwck\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.555339 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.555316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:36:34.673378 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.673346 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck"] Apr 22 19:36:34.676559 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:36:34.676524 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26530f77_4902_4426_a5cb_1a2e5a233017.slice/crio-6304048ff49f67b6226d57916b62f194b16efa0b1dfd794c46e47a341f096b16 WatchSource:0}: Error finding container 6304048ff49f67b6226d57916b62f194b16efa0b1dfd794c46e47a341f096b16: Status 404 returned error can't find the container with id 6304048ff49f67b6226d57916b62f194b16efa0b1dfd794c46e47a341f096b16 Apr 22 19:36:34.804047 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:34.803939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" event={"ID":"26530f77-4902-4426-a5cb-1a2e5a233017","Type":"ContainerStarted","Data":"6304048ff49f67b6226d57916b62f194b16efa0b1dfd794c46e47a341f096b16"} Apr 22 19:36:37.814626 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:37.814591 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" event={"ID":"26530f77-4902-4426-a5cb-1a2e5a233017","Type":"ContainerStarted","Data":"f47c3947da5e27f46c3cd91be981c888392a51408aae6f955ca74e8c05f2ede7"} Apr 22 19:36:37.830510 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:37.830452 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" 
podStartSLOduration=1.2459118120000001 podStartE2EDuration="3.830439933s" podCreationTimestamp="2026-04-22 19:36:34 +0000 UTC" firstStartedPulling="2026-04-22 19:36:34.677926467 +0000 UTC m=+809.593980721" lastFinishedPulling="2026-04-22 19:36:37.262454585 +0000 UTC m=+812.178508842" observedRunningTime="2026-04-22 19:36:37.829727779 +0000 UTC m=+812.745782060" watchObservedRunningTime="2026-04-22 19:36:37.830439933 +0000 UTC m=+812.746494210" Apr 22 19:36:38.681949 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:38.681917 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck"] Apr 22 19:36:39.820157 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:36:39.820090 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" podUID="26530f77-4902-4426-a5cb-1a2e5a233017" containerName="seaweedfs-tls-custom" containerID="cri-o://f47c3947da5e27f46c3cd91be981c888392a51408aae6f955ca74e8c05f2ede7" gracePeriod=30 Apr 22 19:37:07.828503 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:37:07.828476 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26530f77_4902_4426_a5cb_1a2e5a233017.slice/crio-6304048ff49f67b6226d57916b62f194b16efa0b1dfd794c46e47a341f096b16\": RecentStats: unable to find data in memory cache]" Apr 22 19:37:07.902120 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:07.902084 2578 generic.go:358] "Generic (PLEG): container finished" podID="26530f77-4902-4426-a5cb-1a2e5a233017" containerID="f47c3947da5e27f46c3cd91be981c888392a51408aae6f955ca74e8c05f2ede7" exitCode=0 Apr 22 19:37:07.902272 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:07.902114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" 
event={"ID":"26530f77-4902-4426-a5cb-1a2e5a233017","Type":"ContainerDied","Data":"f47c3947da5e27f46c3cd91be981c888392a51408aae6f955ca74e8c05f2ede7"} Apr 22 19:37:07.956212 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:07.956190 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:37:08.119075 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.119047 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/26530f77-4902-4426-a5cb-1a2e5a233017-data\") pod \"26530f77-4902-4426-a5cb-1a2e5a233017\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " Apr 22 19:37:08.119258 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.119120 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75lnr\" (UniqueName: \"kubernetes.io/projected/26530f77-4902-4426-a5cb-1a2e5a233017-kube-api-access-75lnr\") pod \"26530f77-4902-4426-a5cb-1a2e5a233017\" (UID: \"26530f77-4902-4426-a5cb-1a2e5a233017\") " Apr 22 19:37:08.120261 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.120237 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26530f77-4902-4426-a5cb-1a2e5a233017-data" (OuterVolumeSpecName: "data") pod "26530f77-4902-4426-a5cb-1a2e5a233017" (UID: "26530f77-4902-4426-a5cb-1a2e5a233017"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:37:08.121342 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.121315 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26530f77-4902-4426-a5cb-1a2e5a233017-kube-api-access-75lnr" (OuterVolumeSpecName: "kube-api-access-75lnr") pod "26530f77-4902-4426-a5cb-1a2e5a233017" (UID: "26530f77-4902-4426-a5cb-1a2e5a233017"). InnerVolumeSpecName "kube-api-access-75lnr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:37:08.220420 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.220389 2578 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/26530f77-4902-4426-a5cb-1a2e5a233017-data\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:37:08.220420 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.220413 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75lnr\" (UniqueName: \"kubernetes.io/projected/26530f77-4902-4426-a5cb-1a2e5a233017-kube-api-access-75lnr\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 22 19:37:08.906487 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.906445 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" event={"ID":"26530f77-4902-4426-a5cb-1a2e5a233017","Type":"ContainerDied","Data":"6304048ff49f67b6226d57916b62f194b16efa0b1dfd794c46e47a341f096b16"} Apr 22 19:37:08.906487 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.906463 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck" Apr 22 19:37:08.906487 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.906486 2578 scope.go:117] "RemoveContainer" containerID="f47c3947da5e27f46c3cd91be981c888392a51408aae6f955ca74e8c05f2ede7" Apr 22 19:37:08.927077 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.927048 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck"] Apr 22 19:37:08.928737 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:08.928714 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-gcwck"] Apr 22 19:37:09.579545 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:09.579505 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26530f77-4902-4426-a5cb-1a2e5a233017" path="/var/lib/kubelet/pods/26530f77-4902-4426-a5cb-1a2e5a233017/volumes" Apr 22 19:37:19.270821 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.270783 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-jnbph"] Apr 22 19:37:19.271219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.271134 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26530f77-4902-4426-a5cb-1a2e5a233017" containerName="seaweedfs-tls-custom" Apr 22 19:37:19.271219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.271146 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="26530f77-4902-4426-a5cb-1a2e5a233017" containerName="seaweedfs-tls-custom" Apr 22 19:37:19.271219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.271211 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="26530f77-4902-4426-a5cb-1a2e5a233017" containerName="seaweedfs-tls-custom" Apr 22 19:37:19.276462 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.276444 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.282585 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.282556 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c5d95\"" Apr 22 19:37:19.282793 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.282665 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 22 19:37:19.282793 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.282566 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 22 19:37:19.285516 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.285483 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-jnbph"] Apr 22 19:37:19.413845 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.413801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/18351316-f44f-4077-917c-b5cc87f65bcf-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.414029 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.413883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4zkx\" (UniqueName: \"kubernetes.io/projected/18351316-f44f-4077-917c-b5cc87f65bcf-kube-api-access-g4zkx\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.414029 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.413935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/18351316-f44f-4077-917c-b5cc87f65bcf-data\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.514433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.514391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4zkx\" (UniqueName: \"kubernetes.io/projected/18351316-f44f-4077-917c-b5cc87f65bcf-kube-api-access-g4zkx\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.514433 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.514438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18351316-f44f-4077-917c-b5cc87f65bcf-data\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.514664 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.514476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/18351316-f44f-4077-917c-b5cc87f65bcf-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.514858 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.514836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18351316-f44f-4077-917c-b5cc87f65bcf-data\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.516845 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.516828 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/18351316-f44f-4077-917c-b5cc87f65bcf-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.526320 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.526261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4zkx\" (UniqueName: \"kubernetes.io/projected/18351316-f44f-4077-917c-b5cc87f65bcf-kube-api-access-g4zkx\") pod \"seaweedfs-tls-serving-7fd5766db9-jnbph\" (UID: \"18351316-f44f-4077-917c-b5cc87f65bcf\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.593752 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.593720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" Apr 22 19:37:19.713307 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.713189 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-jnbph"] Apr 22 19:37:19.716205 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:37:19.716170 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18351316_f44f_4077_917c_b5cc87f65bcf.slice/crio-2b269c8892eacd47b77d1cbe768b92139503e49b9691df3e2267b4d8b23b15cb WatchSource:0}: Error finding container 2b269c8892eacd47b77d1cbe768b92139503e49b9691df3e2267b4d8b23b15cb: Status 404 returned error can't find the container with id 2b269c8892eacd47b77d1cbe768b92139503e49b9691df3e2267b4d8b23b15cb Apr 22 19:37:19.942852 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:19.942813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" 
event={"ID":"18351316-f44f-4077-917c-b5cc87f65bcf","Type":"ContainerStarted","Data":"2b269c8892eacd47b77d1cbe768b92139503e49b9691df3e2267b4d8b23b15cb"} Apr 22 19:37:20.947435 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:20.947400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" event={"ID":"18351316-f44f-4077-917c-b5cc87f65bcf","Type":"ContainerStarted","Data":"7cdd460bb05042e1c49d44119031177a93f9b5cb47b03bc485c5c4e58dfaebd5"} Apr 22 19:37:20.964039 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:37:20.963977 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-jnbph" podStartSLOduration=1.715813989 podStartE2EDuration="1.963964631s" podCreationTimestamp="2026-04-22 19:37:19 +0000 UTC" firstStartedPulling="2026-04-22 19:37:19.717518178 +0000 UTC m=+854.633572432" lastFinishedPulling="2026-04-22 19:37:19.965668806 +0000 UTC m=+854.881723074" observedRunningTime="2026-04-22 19:37:20.962129722 +0000 UTC m=+855.878184000" watchObservedRunningTime="2026-04-22 19:37:20.963964631 +0000 UTC m=+855.880018908" Apr 22 19:40:29.050701 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.050668 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759"] Apr 22 19:40:29.053959 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.053942 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" Apr 22 19:40:29.056316 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.056297 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5f4v\"" Apr 22 19:40:29.063318 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.063287 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759"] Apr 22 19:40:29.063939 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.063922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" Apr 22 19:40:29.185411 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.185386 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759"] Apr 22 19:40:29.187991 ip-10-0-141-16 kubenswrapper[2578]: W0422 19:40:29.187962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc28a00_19c7_4ad6_8a0b_eb3d9959f043.slice/crio-602fea144229aecdacebc9c45522bfc903a602205245fd79709d64455e1ca229 WatchSource:0}: Error finding container 602fea144229aecdacebc9c45522bfc903a602205245fd79709d64455e1ca229: Status 404 returned error can't find the container with id 602fea144229aecdacebc9c45522bfc903a602205245fd79709d64455e1ca229 Apr 22 19:40:29.189793 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.189774 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:40:29.497114 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:29.497076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" 
event={"ID":"3dc28a00-19c7-4ad6-8a0b-eb3d9959f043","Type":"ContainerStarted","Data":"602fea144229aecdacebc9c45522bfc903a602205245fd79709d64455e1ca229"} Apr 22 19:40:30.502031 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:30.501916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" event={"ID":"3dc28a00-19c7-4ad6-8a0b-eb3d9959f043","Type":"ContainerStarted","Data":"99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7"} Apr 22 19:40:30.502438 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:30.502092 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" Apr 22 19:40:30.504053 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:30.504032 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" Apr 22 19:40:30.526892 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:40:30.526850 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" podStartSLOduration=0.577274392 podStartE2EDuration="1.526836643s" podCreationTimestamp="2026-04-22 19:40:29 +0000 UTC" firstStartedPulling="2026-04-22 19:40:29.189910035 +0000 UTC m=+1044.105964289" lastFinishedPulling="2026-04-22 19:40:30.139472285 +0000 UTC m=+1045.055526540" observedRunningTime="2026-04-22 19:40:30.525182895 +0000 UTC m=+1045.441237184" watchObservedRunningTime="2026-04-22 19:40:30.526836643 +0000 UTC m=+1045.442890919" Apr 22 19:41:54.137760 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:54.137722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-zn759_3dc28a00-19c7-4ad6-8a0b-eb3d9959f043/kserve-container/0.log" Apr 22 19:41:54.639921 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:54.639882 2578 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759"] Apr 22 19:41:54.640219 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:54.640193 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" podUID="3dc28a00-19c7-4ad6-8a0b-eb3d9959f043" containerName="kserve-container" containerID="cri-o://99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7" gracePeriod=30 Apr 22 19:41:54.881948 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:54.881925 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" Apr 22 19:41:55.741804 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.741770 2578 generic.go:358] "Generic (PLEG): container finished" podID="3dc28a00-19c7-4ad6-8a0b-eb3d9959f043" containerID="99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7" exitCode=2 Apr 22 19:41:55.742256 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.741837 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" Apr 22 19:41:55.742256 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.741855 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" event={"ID":"3dc28a00-19c7-4ad6-8a0b-eb3d9959f043","Type":"ContainerDied","Data":"99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7"} Apr 22 19:41:55.742256 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.741890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759" event={"ID":"3dc28a00-19c7-4ad6-8a0b-eb3d9959f043","Type":"ContainerDied","Data":"602fea144229aecdacebc9c45522bfc903a602205245fd79709d64455e1ca229"} Apr 22 19:41:55.742256 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.741905 2578 scope.go:117] "RemoveContainer" containerID="99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7" Apr 22 19:41:55.749534 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.749518 2578 scope.go:117] "RemoveContainer" containerID="99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7" Apr 22 19:41:55.749798 ip-10-0-141-16 kubenswrapper[2578]: E0422 19:41:55.749779 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7\": container with ID starting with 99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7 not found: ID does not exist" containerID="99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7" Apr 22 19:41:55.749890 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.749804 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7"} err="failed to get container status 
\"99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7\": rpc error: code = NotFound desc = could not find container \"99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7\": container with ID starting with 99cd94261e733764780f6e5ce7af1e6b724c74b1e69adc1a27e45e4698a31fa7 not found: ID does not exist" Apr 22 19:41:55.759440 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.759410 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759"] Apr 22 19:41:55.764256 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:55.764237 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-zn759"] Apr 22 19:41:57.578753 ip-10-0-141-16 kubenswrapper[2578]: I0422 19:41:57.578716 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc28a00-19c7-4ad6-8a0b-eb3d9959f043" path="/var/lib/kubelet/pods/3dc28a00-19c7-4ad6-8a0b-eb3d9959f043/volumes" Apr 22 20:25:51.665381 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.665350 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9c5pk/must-gather-v7dc5"] Apr 22 20:25:51.665891 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.665664 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dc28a00-19c7-4ad6-8a0b-eb3d9959f043" containerName="kserve-container" Apr 22 20:25:51.665891 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.665675 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc28a00-19c7-4ad6-8a0b-eb3d9959f043" containerName="kserve-container" Apr 22 20:25:51.665891 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.665725 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dc28a00-19c7-4ad6-8a0b-eb3d9959f043" containerName="kserve-container" Apr 22 20:25:51.668744 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.668728 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.671173 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.671144 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9c5pk\"/\"openshift-service-ca.crt\"" Apr 22 20:25:51.671286 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.671202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9c5pk\"/\"default-dockercfg-lz96s\"" Apr 22 20:25:51.671286 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.671227 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9c5pk\"/\"kube-root-ca.crt\"" Apr 22 20:25:51.679095 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.679075 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/must-gather-v7dc5"] Apr 22 20:25:51.801078 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.801049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxbw9\" (UniqueName: \"kubernetes.io/projected/55cd3d00-c6e8-4583-90ee-bcaf40b49525-kube-api-access-vxbw9\") pod \"must-gather-v7dc5\" (UID: \"55cd3d00-c6e8-4583-90ee-bcaf40b49525\") " pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.801211 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.801096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55cd3d00-c6e8-4583-90ee-bcaf40b49525-must-gather-output\") pod \"must-gather-v7dc5\" (UID: \"55cd3d00-c6e8-4583-90ee-bcaf40b49525\") " pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.901809 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.901778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/55cd3d00-c6e8-4583-90ee-bcaf40b49525-must-gather-output\") pod \"must-gather-v7dc5\" (UID: \"55cd3d00-c6e8-4583-90ee-bcaf40b49525\") " pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.901952 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.901845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxbw9\" (UniqueName: \"kubernetes.io/projected/55cd3d00-c6e8-4583-90ee-bcaf40b49525-kube-api-access-vxbw9\") pod \"must-gather-v7dc5\" (UID: \"55cd3d00-c6e8-4583-90ee-bcaf40b49525\") " pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.902141 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.902119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55cd3d00-c6e8-4583-90ee-bcaf40b49525-must-gather-output\") pod \"must-gather-v7dc5\" (UID: \"55cd3d00-c6e8-4583-90ee-bcaf40b49525\") " pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.909666 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.909633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxbw9\" (UniqueName: \"kubernetes.io/projected/55cd3d00-c6e8-4583-90ee-bcaf40b49525-kube-api-access-vxbw9\") pod \"must-gather-v7dc5\" (UID: \"55cd3d00-c6e8-4583-90ee-bcaf40b49525\") " pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:51.978429 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:51.978348 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9c5pk/must-gather-v7dc5" Apr 22 20:25:52.097278 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:52.097213 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/must-gather-v7dc5"] Apr 22 20:25:52.099437 ip-10-0-141-16 kubenswrapper[2578]: W0422 20:25:52.099408 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55cd3d00_c6e8_4583_90ee_bcaf40b49525.slice/crio-0f7aa6a9181b28703b4ca50a5b46e9be0a6e12373c07c0803a5336510ba6b888 WatchSource:0}: Error finding container 0f7aa6a9181b28703b4ca50a5b46e9be0a6e12373c07c0803a5336510ba6b888: Status 404 returned error can't find the container with id 0f7aa6a9181b28703b4ca50a5b46e9be0a6e12373c07c0803a5336510ba6b888 Apr 22 20:25:52.101190 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:52.101174 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:25:52.241211 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:52.241128 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/must-gather-v7dc5" event={"ID":"55cd3d00-c6e8-4583-90ee-bcaf40b49525","Type":"ContainerStarted","Data":"0f7aa6a9181b28703b4ca50a5b46e9be0a6e12373c07c0803a5336510ba6b888"} Apr 22 20:25:53.245684 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:53.245656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/must-gather-v7dc5" event={"ID":"55cd3d00-c6e8-4583-90ee-bcaf40b49525","Type":"ContainerStarted","Data":"0b1e4ac864bdd2e97192141c0c0e66f9fdfbd3ae0b1da1cdd8375e31461b52f5"} Apr 22 20:25:54.253696 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:54.253646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/must-gather-v7dc5" 
event={"ID":"55cd3d00-c6e8-4583-90ee-bcaf40b49525","Type":"ContainerStarted","Data":"888c7a3023cac19c9e1cff35b32ac6183b70542cf3ebaed5a2f2984ceefc7910"} Apr 22 20:25:54.271080 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:54.270978 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9c5pk/must-gather-v7dc5" podStartSLOduration=2.353125642 podStartE2EDuration="3.270959306s" podCreationTimestamp="2026-04-22 20:25:51 +0000 UTC" firstStartedPulling="2026-04-22 20:25:52.10129346 +0000 UTC m=+3767.017347714" lastFinishedPulling="2026-04-22 20:25:53.019127111 +0000 UTC m=+3767.935181378" observedRunningTime="2026-04-22 20:25:54.269117741 +0000 UTC m=+3769.185172042" watchObservedRunningTime="2026-04-22 20:25:54.270959306 +0000 UTC m=+3769.187013584" Apr 22 20:25:54.462343 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:54.462313 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rgfwb_13bf1528-14c5-43a6-a2a9-60cf081b25b0/global-pull-secret-syncer/0.log" Apr 22 20:25:54.710800 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:54.710771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jth69_57a867c3-e773-4882-a1b2-dc753d0d39ef/konnectivity-agent/0.log" Apr 22 20:25:54.829670 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:54.829638 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-16.ec2.internal_237efac7542ae805317afa8331e5e27b/haproxy/0.log" Apr 22 20:25:58.186302 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.186267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/alertmanager/0.log" Apr 22 20:25:58.209927 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.209892 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/config-reloader/0.log" Apr 22 20:25:58.232685 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.232608 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/kube-rbac-proxy-web/0.log" Apr 22 20:25:58.260176 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.260099 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/kube-rbac-proxy/0.log" Apr 22 20:25:58.285989 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.285955 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/kube-rbac-proxy-metric/0.log" Apr 22 20:25:58.310684 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.310652 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/prom-label-proxy/0.log" Apr 22 20:25:58.333185 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.333105 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1af2773d-c45e-41c5-84c1-8141c91e1e38/init-config-reloader/0.log" Apr 22 20:25:58.487748 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.487669 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-bcf47d79f-8569z_dd14d123-c500-4b7e-9c97-f1b3b16e8570/metrics-server/0.log" Apr 22 20:25:58.613609 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.613581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jlfs8_af74d4e2-417f-4944-b66e-31be143b888b/node-exporter/0.log" Apr 22 20:25:58.641781 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.641756 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-jlfs8_af74d4e2-417f-4944-b66e-31be143b888b/kube-rbac-proxy/0.log" Apr 22 20:25:58.663848 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:58.663804 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jlfs8_af74d4e2-417f-4944-b66e-31be143b888b/init-textfile/0.log" Apr 22 20:25:59.010155 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:59.010121 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jdcj6_2df0037e-915e-45bb-b309-13ac91a9d566/prometheus-operator/0.log" Apr 22 20:25:59.031075 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:59.031039 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jdcj6_2df0037e-915e-45bb-b309-13ac91a9d566/kube-rbac-proxy/0.log" Apr 22 20:25:59.058102 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:59.058072 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-5cd52_7dd66eca-bc70-412c-a906-24c51d1c9fa2/prometheus-operator-admission-webhook/0.log" Apr 22 20:25:59.090148 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:59.090121 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f758ff8cd-vl6jx_92852648-bc0a-4f01-a363-49185a9c21e8/telemeter-client/0.log" Apr 22 20:25:59.132792 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:59.132761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f758ff8cd-vl6jx_92852648-bc0a-4f01-a363-49185a9c21e8/reload/0.log" Apr 22 20:25:59.154655 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:25:59.154627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f758ff8cd-vl6jx_92852648-bc0a-4f01-a363-49185a9c21e8/kube-rbac-proxy/0.log" Apr 22 20:26:00.381935 
ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:00.381906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mkkg8_f9013b10-84e5-4801-a992-369ae0ce0e83/networking-console-plugin/0.log" Apr 22 20:26:01.143621 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.143594 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bfb8d57b4-c2rfn_6ead9e63-437a-4907-9f07-ddb2b92f185e/console/0.log" Apr 22 20:26:01.189053 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.189021 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-99qxx_39b0850e-a695-4553-ac06-dff279c97d24/download-server/0.log" Apr 22 20:26:01.398659 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.398569 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9"] Apr 22 20:26:01.403101 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.403081 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.411654 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.411633 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9"] Apr 22 20:26:01.504314 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.504280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-lib-modules\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.504483 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.504335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-podres\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.504483 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.504390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-sys\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.504483 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.504424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpswg\" (UniqueName: \"kubernetes.io/projected/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-kube-api-access-zpswg\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: 
\"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.504630 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.504499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-proc\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605111 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-lib-modules\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605262 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-podres\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605262 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-sys\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605350 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-podres\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605350 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-lib-modules\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605350 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-sys\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605350 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpswg\" (UniqueName: \"kubernetes.io/projected/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-kube-api-access-zpswg\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605350 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-proc\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.605530 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.605434 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-proc\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.613206 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.613181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpswg\" (UniqueName: \"kubernetes.io/projected/3d77ea96-c1d7-4ef4-852c-2edb48060c1b-kube-api-access-zpswg\") pod \"perf-node-gather-daemonset-r5vj9\" (UID: \"3d77ea96-c1d7-4ef4-852c-2edb48060c1b\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.715131 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.715056 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:01.842487 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:01.842435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9"] Apr 22 20:26:01.845398 ip-10-0-141-16 kubenswrapper[2578]: W0422 20:26:01.845370 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3d77ea96_c1d7_4ef4_852c_2edb48060c1b.slice/crio-08b5028c264cdd9b85512fde0d2bf0f18e0829abfe350f0c2a7090f30037111a WatchSource:0}: Error finding container 08b5028c264cdd9b85512fde0d2bf0f18e0829abfe350f0c2a7090f30037111a: Status 404 returned error can't find the container with id 08b5028c264cdd9b85512fde0d2bf0f18e0829abfe350f0c2a7090f30037111a Apr 22 20:26:02.258512 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:02.258466 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-56v9g_0b284839-b3dc-4bf0-b479-744c1da18b4b/dns/0.log" Apr 22 20:26:02.278965 ip-10-0-141-16 kubenswrapper[2578]: I0422 
20:26:02.278928 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-56v9g_0b284839-b3dc-4bf0-b479-744c1da18b4b/kube-rbac-proxy/0.log" Apr 22 20:26:02.288969 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:02.288938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" event={"ID":"3d77ea96-c1d7-4ef4-852c-2edb48060c1b","Type":"ContainerStarted","Data":"35245c5fa0d24997400a575e7293105c2c5e259bd7048df9802a918718c1c206"} Apr 22 20:26:02.289132 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:02.288973 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" event={"ID":"3d77ea96-c1d7-4ef4-852c-2edb48060c1b","Type":"ContainerStarted","Data":"08b5028c264cdd9b85512fde0d2bf0f18e0829abfe350f0c2a7090f30037111a"} Apr 22 20:26:02.289132 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:02.289048 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:02.306152 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:02.306107 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" podStartSLOduration=1.306093661 podStartE2EDuration="1.306093661s" podCreationTimestamp="2026-04-22 20:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:26:02.304595286 +0000 UTC m=+3777.220649564" watchObservedRunningTime="2026-04-22 20:26:02.306093661 +0000 UTC m=+3777.222147937" Apr 22 20:26:02.415802 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:02.415775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rkjln_ddd68968-e706-4765-85c0-cc5f617ffb19/dns-node-resolver/0.log" Apr 22 20:26:02.858318 ip-10-0-141-16 
kubenswrapper[2578]: I0422 20:26:02.858277 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-47vkz_6ef064a5-78b1-49a5-a46f-8d155af983ba/node-ca/0.log" Apr 22 20:26:03.928609 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:03.928575 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zx6g9_75f18f37-b8b6-4514-90d1-259b37372b4b/serve-healthcheck-canary/0.log" Apr 22 20:26:04.342289 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:04.342258 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ln9lz_67e410f9-b014-4163-a371-50beadc36300/kube-rbac-proxy/0.log" Apr 22 20:26:04.362122 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:04.362088 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ln9lz_67e410f9-b014-4163-a371-50beadc36300/exporter/0.log" Apr 22 20:26:04.382540 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:04.382516 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ln9lz_67e410f9-b014-4163-a371-50beadc36300/extractor/0.log" Apr 22 20:26:06.868805 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:06.868773 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-vcpmv_5207707f-ec1b-4a01-b3f2-003f514ea403/manager/0.log" Apr 22 20:26:07.003912 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:07.003877 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-jnbph_18351316-f44f-4077-917c-b5cc87f65bcf/seaweedfs-tls-serving/0.log" Apr 22 20:26:08.302874 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:08.302845 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-r5vj9" Apr 22 20:26:11.275065 ip-10-0-141-16 kubenswrapper[2578]: 
I0422 20:26:11.275034 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-khgg8_e32d7faf-e9d5-42d2-a5b0-ac06cb089d17/kube-storage-version-migrator-operator/1.log" Apr 22 20:26:11.276896 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:11.276870 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-khgg8_e32d7faf-e9d5-42d2-a5b0-ac06cb089d17/kube-storage-version-migrator-operator/0.log" Apr 22 20:26:12.112728 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.112693 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-66jvk_f5fd4978-3887-4451-945f-2523ac01e21d/kube-multus/0.log" Apr 22 20:26:12.137873 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.137847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/kube-multus-additional-cni-plugins/0.log" Apr 22 20:26:12.159217 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.159193 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/egress-router-binary-copy/0.log" Apr 22 20:26:12.180522 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.180498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/cni-plugins/0.log" Apr 22 20:26:12.200702 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.200664 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/bond-cni-plugin/0.log" Apr 22 20:26:12.221091 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.221057 2578 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/routeoverride-cni/0.log" Apr 22 20:26:12.241812 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.241785 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/whereabouts-cni-bincopy/0.log" Apr 22 20:26:12.262662 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.262629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grm6t_ef0d702b-9f81-4046-b801-085bdfdf12b5/whereabouts-cni/0.log" Apr 22 20:26:12.668082 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.668047 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-czpht_a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4/network-metrics-daemon/0.log" Apr 22 20:26:12.687921 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:12.687898 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-czpht_a79ea9d4-e3c7-4e4e-80eb-47a7ca3f62a4/kube-rbac-proxy/0.log" Apr 22 20:26:13.905949 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:13.905914 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/ovn-controller/0.log" Apr 22 20:26:13.960715 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:13.960684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/ovn-acl-logging/0.log" Apr 22 20:26:13.986336 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:13.986310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/kube-rbac-proxy-node/0.log" Apr 22 20:26:14.010564 ip-10-0-141-16 kubenswrapper[2578]: 
I0422 20:26:14.010536 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:26:14.030626 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:14.030603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/northd/0.log" Apr 22 20:26:14.051570 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:14.051545 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/nbdb/0.log" Apr 22 20:26:14.072308 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:14.072286 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/sbdb/0.log" Apr 22 20:26:14.289781 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:14.289712 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cz572_721dc9c4-46d2-43f9-960d-7a7ecd3081a9/ovnkube-controller/0.log" Apr 22 20:26:15.686470 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:15.686431 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-spzjt_115a7622-6567-4b7d-83ff-39248615e827/network-check-target-container/0.log" Apr 22 20:26:16.557703 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:16.557671 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mncb2_cdc22288-9935-403f-8e99-11cb3daf1c99/iptables-alerter/0.log" Apr 22 20:26:17.285657 ip-10-0-141-16 kubenswrapper[2578]: I0422 20:26:17.285627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wr9b8_63bd826c-675d-4901-ac56-91d345994e80/tuned/0.log"