Apr 22 14:15:25.043089 ip-10-0-139-83 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:25.606562 ip-10-0-139-83 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:25.606562 ip-10-0-139-83 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:25.606562 ip-10-0-139-83 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:25.606562 ip-10-0-139-83 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:25.606562 ip-10-0-139-83 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:25.607387 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.607294    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:25.613772 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613747    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:25.613772 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613766    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:25.613772 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613771    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:25.613772 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613774    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:25.613772 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613778    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.613772 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613781    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613784    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613788    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613791    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613794    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613797    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613800    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613802    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613805    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613808    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613811    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613826    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613831    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613835    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613839    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613842    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613846    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613849    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613853    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613859    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:25.614013 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613863    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613866    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613868    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613871    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613874    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613877    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613880    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613883    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613885    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613888    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613891    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613894    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613896    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613900    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613906    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613911    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613915    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613917    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613921    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:25.614498 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613924    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613927    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613930    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613933    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613935    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613938    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613941    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613943    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613946    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613948    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613950    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613953    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613955    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613958    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613960    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613963    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613965    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613968    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613970    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613973    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:25.614994 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613976    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613978    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613981    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613983    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613986    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613988    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613991    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613995    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.613998    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614002    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614004    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614006    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614009    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614011    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614016    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614019    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614022    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614025    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614028    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614030    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:25.615444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614033    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614035    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614432    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614437    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614440    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614443    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614446    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614448    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614451    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614455    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614458    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614462    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614464    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614467    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614470    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614473    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614476    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614479    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614481    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:25.615920 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614485    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614488    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614491    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614494    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614496    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614499    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614502    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614506    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614509    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614512    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614514    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614516    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614519    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614521    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614524    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614526    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614529    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614531    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614534    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.616406 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614537    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614539    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614541    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614544    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614546    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614548    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614551    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614555    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614558    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614560    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614562    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614565    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614568    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614571    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614574    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614577    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614579    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614581    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614584    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614586    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:25.616908 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614589    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614592    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614594    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614597    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614599    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614602    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614604    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614607    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614609    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614612    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614614    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614616    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614619    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614622    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614625    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614628    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614630    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614632    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614635    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614638    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:25.617369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614641    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614643    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614646    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614648    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614650    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614653    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614655    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614658    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614660    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.614663    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615533    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615543    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615551    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615555    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615560    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615564    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615569    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615573    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615576    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615579    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615583    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:25.617852 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615586    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615590    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615593    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615596    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615599    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615602    2576 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615605    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615608    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615612    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615615    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615619    2576 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615622    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615625    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615629    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615632    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615635    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615638    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615641    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615644    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615647    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615650    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615652    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615657    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615660    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615662    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:25.618344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615665    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615669    2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615672    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615677    2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615680    2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615682 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615686 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615688 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615692 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615695 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615698 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615701 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615704 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615707 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615709 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615712 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615715 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615718 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615721 2576 flags.go:64] FLAG: --feature-gates=""
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615725 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615728 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615731 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615734 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615737 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615741 2576 flags.go:64] FLAG: --help="false"
Apr 22 14:15:25.618950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615744 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-139-83.ec2.internal"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615747 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615750 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615752 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615756 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615759 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615762 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615765 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615767 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615771 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615773 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615777 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615779 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615782 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615785 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615788 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615791 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615794 2576 flags.go:64] FLAG: --lock-file=""
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615796 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615799 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615802 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615807 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615810 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615826 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 14:15:25.619521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615829 2576 flags.go:64] FLAG: --logging-format="text"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615832 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615835 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615838 2576 flags.go:64] FLAG: --manifest-url=""
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615841 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615846 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615849 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615853 2576 flags.go:64] FLAG: --max-pods="110"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615856 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615859 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615862 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615865 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615868 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615870 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615873 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615881 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615885 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615888 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615892 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615895 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615901 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615904 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615907 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615910 2576 flags.go:64] FLAG: --port="10250"
Apr 22 14:15:25.620097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615913 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615917 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-061fa9a34bc1ec039"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615920 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615924 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615927 2576 flags.go:64] FLAG: --register-node="true"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615930 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615933 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615937 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615940 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615943 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615946 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615950 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615953 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615956 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615958 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615961 2576 flags.go:64] FLAG: --runonce="false"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615964 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615968 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615970 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615973 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615976 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615979 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615982 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615988 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615991 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615994 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 14:15:25.620663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.615997 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616000 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616003 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616006 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616009 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616014 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616017 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616020 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616024 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616027 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616029 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616035 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616038 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616041 2576 flags.go:64] FLAG: --v="2"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616046 2576 flags.go:64] FLAG: --version="false"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616050 2576 flags.go:64] FLAG: --vmodule=""
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616055 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.616058 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616877 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616883 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616886 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616890 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616894 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:25.621274 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616897 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616900 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616902 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616905 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616908 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616910 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616915 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616917 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616920 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616923 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616926 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616929 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616931 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616934 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616937 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616939 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616942 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616944 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616946 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:25.621807 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616949 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616954 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616958 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616961 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616964 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616967 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616969 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616972 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616975 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616978 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616980 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616983 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616986 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616988 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616990 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616993 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616995 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.616998 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617000 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.622318 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617004 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617006 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617009 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617012 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617014 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617017 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617020 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617022 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617025 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617027 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617030 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617032 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617035 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617037 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617041 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617043 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617046 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617048 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617051 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617053 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:25.622783 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617056 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617058 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617061 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617063 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617066 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617068 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617071 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617073 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617076 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617078 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617081 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617083 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617088 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617090 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617093 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617095 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617097 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617100 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617103 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617105 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:25.623277 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617108 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:25.623787 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617110 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:25.623787 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.617112 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:25.623787 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.617965 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 14:15:25.624906 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.624768 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 14:15:25.624939 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.624909 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 14:15:25.624971 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624958 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.624971 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624963 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:25.624971 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624967 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.624971 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624970 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624973 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624976 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624979 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624982 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624984 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624987 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624989 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624992 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624995 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.624997 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625000 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625003 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625006 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625009 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625012 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625014 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625017 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625019 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625023 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.625070 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625028 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625030 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625033 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625035 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625038 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625041 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625043 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625045 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625049 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625051 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625054 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625057 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625059 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 
22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625062 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625065 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625068 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625071 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625074 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625078 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625082 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:25.625535 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625084 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625087 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625090 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625093 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625096 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625099 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625102 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625105 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625107 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625110 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625112 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 
14:15:25.625115 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625117 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625120 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625122 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625125 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625128 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625130 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625133 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625135 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:25.626108 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625138 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625141 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625143 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625146 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 
14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625149 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625151 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625154 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625157 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625160 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625162 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625165 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625167 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625170 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625172 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625174 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625177 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625179 2576 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625182 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625184 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625187 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:25.626586 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625189 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625191 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625194 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.625199 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625298 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625303 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625306 2576 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625308 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625311 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625315 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625319 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625322 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625326 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625328 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625332 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:25.627122 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625335 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625337 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625340 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625342 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 
14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625345 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625348 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625350 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625353 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625356 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625358 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625361 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625363 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625366 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625368 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625371 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625373 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625376 2576 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiNetworks Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625379 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625381 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625384 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:25.627476 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625386 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625389 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625391 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625394 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625396 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625399 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625401 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625404 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625407 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625409 
2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625412 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625414 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625417 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625420 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625422 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625425 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625428 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625430 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625433 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625435 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:25.627961 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625438 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625440 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:25.628444 ip-10-0-139-83 
kubenswrapper[2576]: W0422 14:15:25.625443 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625445 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625448 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625450 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625452 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625455 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625458 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625460 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625463 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625465 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625467 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625470 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625472 2576 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625475 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625477 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625479 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625483 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625486 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:25.628444 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625489 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625491 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625494 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625496 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625498 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625501 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625504 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:25.628955 ip-10-0-139-83 
kubenswrapper[2576]: W0422 14:15:25.625506 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625509 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625512 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625514 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625517 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625519 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625521 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:25.625523 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:25.628955 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.625528 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:25.629310 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.626225 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 
14:15:25.629412 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.629399 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 14:15:25.630211 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.630199 2576 server.go:1019] "Starting client certificate rotation" Apr 22 14:15:25.630328 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.630309 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:25.630392 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.630367 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:25.657740 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.657719 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:25.661985 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.661950 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:25.679875 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.679853 2576 log.go:25] "Validated CRI v1 runtime API" Apr 22 14:15:25.685999 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.685983 2576 log.go:25] "Validated CRI v1 image API" Apr 22 14:15:25.687245 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.687221 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 14:15:25.690124 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.690103 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:25.692138 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.692111 2576 fs.go:135] Filesystem UUIDs: 
map[10e9e418-8705-4249-ba04-22a8e13a9994:/dev/nvme0n1p4 3dcb62c4-4572-4b21-925d-b5288781d522:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 22 14:15:25.692203 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.692138 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 14:15:25.698451 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.698342 2576 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:25.696483431 +0000 UTC m=+0.513263787 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200358 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec297ec4f01493752aa92d01282add96 SystemUUID:ec297ec4-f014-9375-2aa9-2d01282add96 BootID:7ea7ee61-1bdb-44ae-afe5-a6c572e1cd24 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 
Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ee:2a:38:51:87 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ee:2a:38:51:87 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:68:c9:b1:5b:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 14:15:25.699022 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.699011 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 14:15:25.699149 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.699136 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:25.700211 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.700185 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:25.700375 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.700214 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-83.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:25.700418 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.700384 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:25.700418 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.700392 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:25.700418 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.700405 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:25.701206 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.701196 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:25.702795 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.702780 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:25.702963 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.702953 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:25.705533 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.705523 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:25.705571 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.705538 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 14:15:25.705571 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.705550 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:25.705571 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.705560 2576 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:25.705571 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.705569 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 14:15:25.706663 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.706651 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:25.706708 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.706669 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:25.709557 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.709539 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:25.710980 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.710967 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:25.712775 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712762 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:25.712811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712782 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:25.712811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712789 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:25.712811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712794 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:25.712811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712799 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:25.712811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712805 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:25.712811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712811 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
14:15:25.712977 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712832 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:25.712977 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712843 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:25.712977 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712849 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:25.712977 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712858 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:25.712977 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.712867 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:25.713855 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.713845 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:25.713855 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.713855 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 14:15:25.717511 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.717489 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:25.717511 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.717490 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:25.717511 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.717496 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:25.717676 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.717527 2576 server.go:1295] "Started kubelet" Apr 22 14:15:25.717676 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.717528 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-83.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 14:15:25.717676 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.717608 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:25.717762 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.717656 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:25.719533 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.719510 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:25.720294 ip-10-0-139-83 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 14:15:25.720856 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.720833 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:25.722064 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.722002 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4lbsn" Apr 22 14:15:25.723113 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.722843 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:25.727034 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.726016 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-83.ec2.internal.18a8b36b2263d978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-83.ec2.internal,UID:ip-10-0-139-83.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-83.ec2.internal,},FirstTimestamp:2026-04-22 14:15:25.717502328 +0000 UTC m=+0.534282686,LastTimestamp:2026-04-22 14:15:25.717502328 +0000 UTC m=+0.534282686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-83.ec2.internal,}" Apr 22 14:15:25.729155 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729138 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 14:15:25.729228 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729156 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:25.729827 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729795 2576 
volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 14:15:25.729916 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729833 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 14:15:25.729916 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.729909 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:25.730018 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729935 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 14:15:25.730018 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729967 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 22 14:15:25.730018 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.729976 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 22 14:15:25.730277 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.730255 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 14:15:25.730277 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730258 2576 factory.go:153] Registering CRI-O factory Apr 22 14:15:25.730429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730288 2576 factory.go:223] Registration of the crio container factory successfully Apr 22 14:15:25.730429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730328 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 14:15:25.730429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730336 2576 factory.go:55] Registering systemd factory Apr 22 14:15:25.730429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730341 2576 factory.go:223] Registration of the systemd container factory successfully Apr 22 14:15:25.730429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730358 2576 factory.go:103] Registering Raw factory Apr 22 14:15:25.730429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730366 2576 manager.go:1196] Started watching for new ooms in manager Apr 22 14:15:25.730695 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.730656 2576 manager.go:319] Starting recovery of all containers Apr 22 14:15:25.731190 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.731172 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4lbsn" Apr 22 14:15:25.740945 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.740928 2576 manager.go:324] Recovery completed Apr 22 14:15:25.741037 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.740986 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:25.743738 ip-10-0-139-83 
kubenswrapper[2576]: E0422 14:15:25.743720 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-83.ec2.internal\" not found" node="ip-10-0-139-83.ec2.internal" Apr 22 14:15:25.745449 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.745437 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.747781 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.747767 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.747905 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.747799 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.747905 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.747840 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.748450 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.748438 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 14:15:25.748488 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.748450 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 14:15:25.748488 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.748467 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:25.750646 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.750635 2576 policy_none.go:49] "None policy: Start" Apr 22 14:15:25.750688 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.750650 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 14:15:25.750688 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.750660 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 22 14:15:25.796423 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796399 2576 manager.go:341] 
"Starting Device Plugin manager" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.796470 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796485 2576 server.go:85] "Starting device plugin registration server" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796726 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796737 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796836 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796914 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.796922 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.797712 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 14:15:25.813360 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.797746 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:25.890539 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.890456 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 22 14:15:25.891677 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.891660 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 14:15:25.891738 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.891695 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 14:15:25.891738 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.891719 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 14:15:25.891738 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.891729 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 14:15:25.891945 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.891839 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 14:15:25.897518 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.897497 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.897630 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.897533 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:25.898410 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.898394 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.898478 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.898423 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.898478 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.898434 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.898478 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.898456 2576 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-83.ec2.internal" Apr 22 14:15:25.907529 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.907515 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-83.ec2.internal" Apr 22 14:15:25.907572 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.907537 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-83.ec2.internal\": node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:25.923129 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:25.923109 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:25.991940 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.991911 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal"] Apr 22 14:15:25.992008 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.991990 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.992876 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.992862 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.992942 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.992888 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.992942 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.992898 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.994194 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994183 2576 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Apr 22 14:15:25.994342 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994328 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:25.994387 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994358 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.994881 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994859 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.994973 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994890 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.994973 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994900 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.994973 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994862 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.994973 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994954 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.994973 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.994968 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.996300 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.996287 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" Apr 22 14:15:25.996339 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.996310 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.996939 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.996924 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.997011 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.996948 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.997011 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:25.996961 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:26.021445 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.021420 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-83.ec2.internal\" not found" node="ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.023315 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.023294 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.025765 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.025748 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-83.ec2.internal\" not found" node="ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.123485 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.123456 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.131775 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.131752 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/949c3d68cacea17dc74f73d3f1da9031-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal\" (UID: \"949c3d68cacea17dc74f73d3f1da9031\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.131859 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.131786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/949c3d68cacea17dc74f73d3f1da9031-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal\" (UID: \"949c3d68cacea17dc74f73d3f1da9031\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.131859 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.131805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4b55a5e2e0ee6a678340d55dba3cf653-config\") pod \"kube-apiserver-proxy-ip-10-0-139-83.ec2.internal\" (UID: \"4b55a5e2e0ee6a678340d55dba3cf653\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.223724 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.223662 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.232025 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.232000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/949c3d68cacea17dc74f73d3f1da9031-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal\" (UID: \"949c3d68cacea17dc74f73d3f1da9031\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.232085 ip-10-0-139-83 
kubenswrapper[2576]: I0422 14:15:26.232033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/949c3d68cacea17dc74f73d3f1da9031-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal\" (UID: \"949c3d68cacea17dc74f73d3f1da9031\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.232085 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.232053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4b55a5e2e0ee6a678340d55dba3cf653-config\") pod \"kube-apiserver-proxy-ip-10-0-139-83.ec2.internal\" (UID: \"4b55a5e2e0ee6a678340d55dba3cf653\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.232168 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.232096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4b55a5e2e0ee6a678340d55dba3cf653-config\") pod \"kube-apiserver-proxy-ip-10-0-139-83.ec2.internal\" (UID: \"4b55a5e2e0ee6a678340d55dba3cf653\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.232168 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.232110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/949c3d68cacea17dc74f73d3f1da9031-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal\" (UID: \"949c3d68cacea17dc74f73d3f1da9031\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.232168 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.232131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/949c3d68cacea17dc74f73d3f1da9031-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal\" (UID: \"949c3d68cacea17dc74f73d3f1da9031\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.324175 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.324141 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.324258 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.324188 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.331270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.331249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.424694 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.424653 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.525304 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.525215 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.567897 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.567868 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:26.625494 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.625455 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-83.ec2.internal\" not found" Apr 22 14:15:26.626160 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.625668 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:26.630099 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.630084 2576 kubelet.go:3340] 
"Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:26.631196 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.631179 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 14:15:26.631303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.631286 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="an error on the server (\"unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close\") has prevented the request from succeeding" Apr 22 14:15:26.631356 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.631290 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="an error on the server (\"unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close\") has prevented the request from succeeding" Apr 22 14:15:26.631356 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.631320 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="an error on the server (\"unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close\") has prevented the request from succeeding" Apr 22 14:15:26.631356 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.631324 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="an error on the server (\"unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close\") has prevented the request from succeeding" Apr 22 14:15:26.631356 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.631340 
2576 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a56207b6c262c40bab53ed701914fc3a-dce92e9ce5c6f3fa.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": write tcp 10.0.139.83:42370->54.144.23.133:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal"
Apr 22 14:15:26.631471 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.631359 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal"
Apr 22 14:15:26.651667 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.651643 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:15:26.706308 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.706273 2576 apiserver.go:52] "Watching apiserver"
Apr 22 14:15:26.713139 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.713116 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 14:15:26.714352 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.714252 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-dngb2","openshift-network-diagnostics/network-check-target-xj22k","kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal","openshift-dns/node-resolver-lhmjg","openshift-image-registry/node-ca-9rd7w","openshift-network-operator/iptables-alerter-jrk58","openshift-ovn-kubernetes/ovnkube-node-pbtw8","kube-system/konnectivity-agent-g9d2z","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz","openshift-cluster-node-tuning-operator/tuned-zflp5","openshift-multus/multus-additional-cni-plugins-lbhgd","openshift-multus/multus-pn6dm"]
Apr 22 14:15:26.718161 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.718136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:26.718258 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.718211 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:26.719252 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.719231 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:26.719363 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.719346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jrk58"
Apr 22 14:15:26.719425 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.719357 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:26.721659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.721637 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-g9d2z"
Apr 22 14:15:26.721956 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.721942 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 14:15:26.722040 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.721964 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.722040 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.722013 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.722165 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.722092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2nd57\""
Apr 22 14:15:26.722895 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.722877 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lhmjg"
Apr 22 14:15:26.723009 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.722982 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9rd7w"
Apr 22 14:15:26.723853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.723836 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xkp9b\""
Apr 22 14:15:26.724001 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.723990 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 14:15:26.724058 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.724028 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 14:15:26.724307 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.724294 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.725478 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.725453 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz"
Apr 22 14:15:26.725890 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.725876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.726074 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.726055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 14:15:26.726196 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.726134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.726310 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.726295 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rxc75\""
Apr 22 14:15:26.726481 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.726466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.726867 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.726850 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.728070 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.728053 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.728858 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.728837 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 14:15:26.728958 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.728928 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 14:15:26.729074 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729058 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.729177 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729124 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.729231 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729190 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.729278 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729242 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:26.729278 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729269 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 14:15:26.729696 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 14:15:26.729696 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729436 2576 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hrf9n\""
Apr 22 14:15:26.729696 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729639 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.729856 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.729743 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-76fxw\""
Apr 22 14:15:26.730596 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.730596 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730579 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.730744 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730616 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vwwdk\""
Apr 22 14:15:26.730744 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.730863 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fc8c7\""
Apr 22 14:15:26.730863 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730786 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 14:15:26.730863 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730804 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.731004 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730870 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 14:15:26.731004 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.730992 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 14:15:26.731225 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.731208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-86fvw\""
Apr 22 14:15:26.731322 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.731246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 14:15:26.731403 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.731333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 14:15:26.731403 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.731358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 14:15:26.731506 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.731475 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:26.733172 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.733073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-s27n5\""
Apr 22 14:15:26.733172 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.733082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 14:15:26.733172
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.733068 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:25 +0000 UTC" deadline="2028-01-29 10:53:00.137174042 +0000 UTC"
Apr 22 14:15:26.733172 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.733141 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15524h37m33.404039746s"
Apr 22 14:15:26.734430 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-registration-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz"
Apr 22 14:15:26.734503 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4h8\" (UniqueName: \"kubernetes.io/projected/eca26ba6-2d35-49df-a2f1-164475c0423f-kube-api-access-qp4h8\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz"
Apr 22 14:15:26.734503 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-env-overrides\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.734605 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.734605 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-device-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz"
Apr 22 14:15:26.734698 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysctl-d\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.734746 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-cni-binary-copy\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.734793 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ace2c57e-f44e-4d5d-b3fc-b036816a748d-tmp\") pod \"tuned-zflp5\" (UID:
\"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.734868 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-cnibin\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.734920 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-kubelet\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.734964 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrv9\" (UniqueName: \"kubernetes.io/projected/7d49b78a-27ae-4f41-a759-29b898bf6fe1-kube-api-access-jnrv9\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:26.734964 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f743df8-0701-4855-a2fa-4b71d8a6efc9-host-slash\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58"
Apr 22 14:15:26.735042 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.734977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-kubelet\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.735042 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovnkube-script-lib\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.735124 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.735124 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-modprobe-d\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.735209 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysconfig\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.735209 ip-10-0-139-83 kubenswrapper[2576]: I0422
14:15:26.735168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-run\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.735290 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-k8s-cni-cncf-io\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.735290 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-multus-certs\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.735374 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1563b033-b9b8-425b-ab3d-4a3b05b42fec-agent-certs\") pod \"konnectivity-agent-g9d2z\" (UID: \"1563b033-b9b8-425b-ab3d-4a3b05b42fec\") " pod="kube-system/konnectivity-agent-g9d2z"
Apr 22 14:15:26.735374 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/836b411b-66a6-4504-9937-fe987775439a-multus-daemon-config\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.735454 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xlh\" (UniqueName: \"kubernetes.io/projected/377748a7-900a-4086-b92d-5dcf4538b46f-kube-api-access-85xlh\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w"
Apr 22 14:15:26.735454 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-slash\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.735538 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-node-log\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.735591 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-log-socket\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.735661 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovnkube-config\") pod \"ovnkube-node-pbtw8\" (UID:
\"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-sys\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-systemd-units\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-system-cni-dir\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-cnibin\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpsf\" (UniqueName: \"kubernetes.io/projected/87e0334c-0350-4896-8f0a-f8f03953749a-kube-api-access-zxpsf\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxshl\" (UniqueName: \"kubernetes.io/projected/ace2c57e-f44e-4d5d-b3fc-b036816a748d-kube-api-access-bxshl\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.735988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-socket-dir-parent\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovn-node-metrics-cert\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.736108 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-hosts-file\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg"
Apr 22 14:15:26.736830 ip-10-0-139-83
kubenswrapper[2576]: I0422 14:15:26.736154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-sys-fs\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz"
Apr 22 14:15:26.736923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-lib-modules\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5"
Apr 22 14:15:26.736923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-etc-kubernetes\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.736923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/377748a7-900a-4086-b92d-5dcf4538b46f-serviceca\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w"
Apr 22 14:15:26.737066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd"
Apr 22 14:15:26.737066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-conf-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm"
Apr 22 14:15:26.737066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f743df8-0701-4855-a2fa-4b71d8a6efc9-iptables-alerter-script\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58"
Apr 22 14:15:26.737066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.736999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/377748a7-900a-4086-b92d-5dcf4538b46f-host\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w"
Apr 22 14:15:26.737066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-etc-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName:
\"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-os-release\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-cni-netd\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737220 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-kubernetes\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.737270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-run-netns\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-var-lib-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-854hz\" (UniqueName: \"kubernetes.io/projected/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-kube-api-access-854hz\") pod \"node-resolver-lhmjg\" (UID: 
\"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-system-cni-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-hostroot\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-cni-bin\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-socket-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-cni-bin\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-systemd\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwkg\" (UniqueName: \"kubernetes.io/projected/baba1b59-01cc-4a9a-8350-a118e41a4e8b-kube-api-access-8lwkg\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-os-release\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: 
I0422 14:15:26.737578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysctl-conf\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-var-lib-kubelet\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.737632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-tuned\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-cni-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/836b411b-66a6-4504-9937-fe987775439a-cni-binary-copy\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " 
pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-cni-multus\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-tmp-dir\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-systemd\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-netns\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmk4\" (UniqueName: \"kubernetes.io/projected/836b411b-66a6-4504-9937-fe987775439a-kube-api-access-ffmk4\") pod \"multus-pn6dm\" 
(UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs7px\" (UniqueName: \"kubernetes.io/projected/2f743df8-0701-4855-a2fa-4b71d8a6efc9-kube-api-access-qs7px\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1563b033-b9b8-425b-ab3d-4a3b05b42fec-konnectivity-ca\") pod \"konnectivity-agent-g9d2z\" (UID: \"1563b033-b9b8-425b-ab3d-4a3b05b42fec\") " pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-ovn\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-host\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.738198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.737989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:26.740269 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.740251 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:26.766221 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.766202 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gcnmz" Apr 22 14:15:26.775223 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.775205 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gcnmz" Apr 22 14:15:26.838282 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/377748a7-900a-4086-b92d-5dcf4538b46f-serviceca\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:26.838464 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.838464 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-conf-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.838464 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f743df8-0701-4855-a2fa-4b71d8a6efc9-iptables-alerter-script\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-conf-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/377748a7-900a-4086-b92d-5dcf4538b46f-host\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-etc-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-os-release\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/377748a7-900a-4086-b92d-5dcf4538b46f-host\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:26.838617 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/377748a7-900a-4086-b92d-5dcf4538b46f-serviceca\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-cni-netd\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-etc-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-kubernetes\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-cni-netd\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-os-release\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-kubernetes\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-etc-selinux\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-run-netns\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-var-lib-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838880 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-854hz\" (UniqueName: \"kubernetes.io/projected/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-kube-api-access-854hz\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-system-cni-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839000 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-var-lib-openvswitch\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-hostroot\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-cni-bin\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-run-netns\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-system-cni-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.838994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-socket-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-cni-bin\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-cni-bin\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-systemd\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-hostroot\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lwkg\" (UniqueName: \"kubernetes.io/projected/baba1b59-01cc-4a9a-8350-a118e41a4e8b-kube-api-access-8lwkg\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-socket-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-cni-bin\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-os-release\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-systemd\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:26.839730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f743df8-0701-4855-a2fa-4b71d8a6efc9-iptables-alerter-script\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839175 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-os-release\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysctl-conf\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-var-lib-kubelet\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-tuned\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-cni-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839303 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-var-lib-kubelet\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/836b411b-66a6-4504-9937-fe987775439a-cni-binary-copy\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-cni-multus\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysctl-conf\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-tmp-dir\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-systemd\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-netns\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmk4\" (UniqueName: \"kubernetes.io/projected/836b411b-66a6-4504-9937-fe987775439a-kube-api-access-ffmk4\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-cni-dir\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs7px\" (UniqueName: \"kubernetes.io/projected/2f743df8-0701-4855-a2fa-4b71d8a6efc9-kube-api-access-qs7px\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1563b033-b9b8-425b-ab3d-4a3b05b42fec-konnectivity-ca\") pod \"konnectivity-agent-g9d2z\" (UID: \"1563b033-b9b8-425b-ab3d-4a3b05b42fec\") " pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-systemd\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.840509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-ovn\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-host\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839623 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-registration-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-netns\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839681 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.839792 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.839895 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:27.339857264 +0000 UTC m=+2.156637619 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-run-ovn\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-tmp-dir\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-host\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-cni-multus\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-registration-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.839695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4h8\" (UniqueName: \"kubernetes.io/projected/eca26ba6-2d35-49df-a2f1-164475c0423f-kube-api-access-qp4h8\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-env-overrides\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.841255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/836b411b-66a6-4504-9937-fe987775439a-cni-binary-copy\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840221 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-device-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysctl-d\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-cni-binary-copy\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ace2c57e-f44e-4d5d-b3fc-b036816a748d-tmp\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-cnibin\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840375 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-kubelet\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrv9\" (UniqueName: \"kubernetes.io/projected/7d49b78a-27ae-4f41-a759-29b898bf6fe1-kube-api-access-jnrv9\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f743df8-0701-4855-a2fa-4b71d8a6efc9-host-slash\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1563b033-b9b8-425b-ab3d-4a3b05b42fec-konnectivity-ca\") pod \"konnectivity-agent-g9d2z\" (UID: \"1563b033-b9b8-425b-ab3d-4a3b05b42fec\") " pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-kubelet\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 
14:15:26.840485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovnkube-script-lib\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-modprobe-d\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysconfig\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842026 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-run\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842026 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-k8s-cni-cncf-io\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-multus-certs\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1563b033-b9b8-425b-ab3d-4a3b05b42fec-agent-certs\") pod \"konnectivity-agent-g9d2z\" (UID: \"1563b033-b9b8-425b-ab3d-4a3b05b42fec\") " pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/836b411b-66a6-4504-9937-fe987775439a-multus-daemon-config\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85xlh\" (UniqueName: \"kubernetes.io/projected/377748a7-900a-4086-b92d-5dcf4538b46f-kube-api-access-85xlh\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:26.842782 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-slash\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-node-log\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysctl-d\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-log-socket\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" 
Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-log-socket\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovnkube-config\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-sys\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-device-dir\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-systemd-units\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 
14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-system-cni-dir\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-cnibin\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.842782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.840988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-cnibin\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpsf\" (UniqueName: \"kubernetes.io/projected/87e0334c-0350-4896-8f0a-f8f03953749a-kube-api-access-zxpsf\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxshl\" (UniqueName: \"kubernetes.io/projected/ace2c57e-f44e-4d5d-b3fc-b036816a748d-kube-api-access-bxshl\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-kubelet\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-socket-dir-parent\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovn-node-metrics-cert\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-hosts-file\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-sys-fs\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-lib-modules\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-etc-kubernetes\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-etc-kubernetes\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-cni-binary-copy\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovnkube-script-lib\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-multus-certs\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f743df8-0701-4855-a2fa-4b71d8a6efc9-host-slash\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841709 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-env-overrides\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-var-lib-kubelet\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovnkube-config\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.843551 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-sys\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-systemd-units\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-multus-socket-dir-parent\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.841970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eca26ba6-2d35-49df-a2f1-164475c0423f-sys-fs\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-hosts-file\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 
14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-system-cni-dir\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-host-slash\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87e0334c-0350-4896-8f0a-f8f03953749a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87e0334c-0350-4896-8f0a-f8f03953749a-cnibin\") pod \"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/baba1b59-01cc-4a9a-8350-a118e41a4e8b-node-log\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-modprobe-d\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-lib-modules\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-sysconfig\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ace2c57e-f44e-4d5d-b3fc-b036816a748d-run\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/836b411b-66a6-4504-9937-fe987775439a-host-run-k8s-cni-cncf-io\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 
14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/836b411b-66a6-4504-9937-fe987775439a-multus-daemon-config\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.842696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ace2c57e-f44e-4d5d-b3fc-b036816a748d-tmp\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.843102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ace2c57e-f44e-4d5d-b3fc-b036816a748d-etc-tuned\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.844774 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.844433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/baba1b59-01cc-4a9a-8350-a118e41a4e8b-ovn-node-metrics-cert\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.844774 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.844468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1563b033-b9b8-425b-ab3d-4a3b05b42fec-agent-certs\") pod \"konnectivity-agent-g9d2z\" (UID: \"1563b033-b9b8-425b-ab3d-4a3b05b42fec\") " pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:26.850317 ip-10-0-139-83 kubenswrapper[2576]: E0422 
14:15:26.849099 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:26.850317 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.849124 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:26.850317 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.849140 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:26.850317 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:26.849215 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:27.349189631 +0000 UTC m=+2.165969978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:26.852856 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.852266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-854hz\" (UniqueName: \"kubernetes.io/projected/7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba-kube-api-access-854hz\") pod \"node-resolver-lhmjg\" (UID: \"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba\") " pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:26.852856 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.852631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lwkg\" (UniqueName: \"kubernetes.io/projected/baba1b59-01cc-4a9a-8350-a118e41a4e8b-kube-api-access-8lwkg\") pod \"ovnkube-node-pbtw8\" (UID: \"baba1b59-01cc-4a9a-8350-a118e41a4e8b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:26.854398 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.854379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxshl\" (UniqueName: \"kubernetes.io/projected/ace2c57e-f44e-4d5d-b3fc-b036816a748d-kube-api-access-bxshl\") pod \"tuned-zflp5\" (UID: \"ace2c57e-f44e-4d5d-b3fc-b036816a748d\") " pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:26.855923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.855905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs7px\" (UniqueName: \"kubernetes.io/projected/2f743df8-0701-4855-a2fa-4b71d8a6efc9-kube-api-access-qs7px\") pod \"iptables-alerter-jrk58\" (UID: \"2f743df8-0701-4855-a2fa-4b71d8a6efc9\") " 
pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:26.858738 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.858707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xlh\" (UniqueName: \"kubernetes.io/projected/377748a7-900a-4086-b92d-5dcf4538b46f-kube-api-access-85xlh\") pod \"node-ca-9rd7w\" (UID: \"377748a7-900a-4086-b92d-5dcf4538b46f\") " pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:26.861243 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.861225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmk4\" (UniqueName: \"kubernetes.io/projected/836b411b-66a6-4504-9937-fe987775439a-kube-api-access-ffmk4\") pod \"multus-pn6dm\" (UID: \"836b411b-66a6-4504-9937-fe987775439a\") " pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.861324 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.861225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4h8\" (UniqueName: \"kubernetes.io/projected/eca26ba6-2d35-49df-a2f1-164475c0423f-kube-api-access-qp4h8\") pod \"aws-ebs-csi-driver-node-fg7jz\" (UID: \"eca26ba6-2d35-49df-a2f1-164475c0423f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:26.862320 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.862300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrv9\" (UniqueName: \"kubernetes.io/projected/7d49b78a-27ae-4f41-a759-29b898bf6fe1-kube-api-access-jnrv9\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:26.862607 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.862591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpsf\" (UniqueName: \"kubernetes.io/projected/87e0334c-0350-4896-8f0a-f8f03953749a-kube-api-access-zxpsf\") pod 
\"multus-additional-cni-plugins-lbhgd\" (UID: \"87e0334c-0350-4896-8f0a-f8f03953749a\") " pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.870605 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.870587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" Apr 22 14:15:26.877586 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.877572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pn6dm" Apr 22 14:15:26.937711 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:26.937674 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b55a5e2e0ee6a678340d55dba3cf653.slice/crio-c393bde836fd76d3129fa9b67056eaa7fec3f494b41c663c7d00d24c368c6853 WatchSource:0}: Error finding container c393bde836fd76d3129fa9b67056eaa7fec3f494b41c663c7d00d24c368c6853: Status 404 returned error can't find the container with id c393bde836fd76d3129fa9b67056eaa7fec3f494b41c663c7d00d24c368c6853 Apr 22 14:15:26.938218 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:26.938193 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod949c3d68cacea17dc74f73d3f1da9031.slice/crio-d9e783805f8fbe77a0cd9e64f429e2d285c4cbb055fdd524a717e9b38d2a375e WatchSource:0}: Error finding container d9e783805f8fbe77a0cd9e64f429e2d285c4cbb055fdd524a717e9b38d2a375e: Status 404 returned error can't find the container with id d9e783805f8fbe77a0cd9e64f429e2d285c4cbb055fdd524a717e9b38d2a375e Apr 22 14:15:26.942083 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:26.942056 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:15:27.046037 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.045938 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jrk58" Apr 22 14:15:27.051594 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.051563 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f743df8_0701_4855_a2fa_4b71d8a6efc9.slice/crio-912461b33322dda9cf4b3fb30bbb24c705cda2a38c8c63186f3400ad5e1bb3df WatchSource:0}: Error finding container 912461b33322dda9cf4b3fb30bbb24c705cda2a38c8c63186f3400ad5e1bb3df: Status 404 returned error can't find the container with id 912461b33322dda9cf4b3fb30bbb24c705cda2a38c8c63186f3400ad5e1bb3df Apr 22 14:15:27.059611 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.059592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:27.065721 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.065700 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1563b033_b9b8_425b_ab3d_4a3b05b42fec.slice/crio-322dd204ed7b57b188cf5170148da34df708d9a48d476f8929ed7035e1a6a4f1 WatchSource:0}: Error finding container 322dd204ed7b57b188cf5170148da34df708d9a48d476f8929ed7035e1a6a4f1: Status 404 returned error can't find the container with id 322dd204ed7b57b188cf5170148da34df708d9a48d476f8929ed7035e1a6a4f1 Apr 22 14:15:27.074381 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.074365 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lhmjg" Apr 22 14:15:27.081064 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.081037 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd35d56_d453_4a46_9d90_e6d7f3ddd0ba.slice/crio-a2ffff9e8a3691d2aa7fc7608acf35216e4120cb2aafeb99b501c92cd5094704 WatchSource:0}: Error finding container a2ffff9e8a3691d2aa7fc7608acf35216e4120cb2aafeb99b501c92cd5094704: Status 404 returned error can't find the container with id a2ffff9e8a3691d2aa7fc7608acf35216e4120cb2aafeb99b501c92cd5094704 Apr 22 14:15:27.089599 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.089583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9rd7w" Apr 22 14:15:27.096078 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.096035 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod377748a7_900a_4086_b92d_5dcf4538b46f.slice/crio-b8d85815afe4ac110a14480564163576596cc83c9651beb9600d82a54c5a8029 WatchSource:0}: Error finding container b8d85815afe4ac110a14480564163576596cc83c9651beb9600d82a54c5a8029: Status 404 returned error can't find the container with id b8d85815afe4ac110a14480564163576596cc83c9651beb9600d82a54c5a8029 Apr 22 14:15:27.105195 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.105177 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:27.111271 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.111247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaba1b59_01cc_4a9a_8350_a118e41a4e8b.slice/crio-f3ed744d185b457ea55b3f29f5f052e091bebcca2c09d1cf0b6364f6f264ca6e WatchSource:0}: Error finding container f3ed744d185b457ea55b3f29f5f052e091bebcca2c09d1cf0b6364f6f264ca6e: Status 404 returned error can't find the container with id f3ed744d185b457ea55b3f29f5f052e091bebcca2c09d1cf0b6364f6f264ca6e Apr 22 14:15:27.122480 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.122461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" Apr 22 14:15:27.132779 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.132758 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeca26ba6_2d35_49df_a2f1_164475c0423f.slice/crio-d4ea384e2eb7a231cb41f72ed01bb7afae615992c63eed0e5c309bc5a459d5d3 WatchSource:0}: Error finding container d4ea384e2eb7a231cb41f72ed01bb7afae615992c63eed0e5c309bc5a459d5d3: Status 404 returned error can't find the container with id d4ea384e2eb7a231cb41f72ed01bb7afae615992c63eed0e5c309bc5a459d5d3 Apr 22 14:15:27.139349 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.139327 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e0334c_0350_4896_8f0a_f8f03953749a.slice/crio-77dac228e483ed6ba06a0e5d913d8fd65efa60ef2a1603e48d96960665c4d628 WatchSource:0}: Error finding container 77dac228e483ed6ba06a0e5d913d8fd65efa60ef2a1603e48d96960665c4d628: Status 404 returned error can't find the container with id 77dac228e483ed6ba06a0e5d913d8fd65efa60ef2a1603e48d96960665c4d628 Apr 22 14:15:27.139977 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.139959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zflp5" Apr 22 14:15:27.145487 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.145465 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podace2c57e_f44e_4d5d_b3fc_b036816a748d.slice/crio-bdba2bb5f380610dc4ead2b1de163e721ce4046a7a9928f95398d070e2ab27fb WatchSource:0}: Error finding container bdba2bb5f380610dc4ead2b1de163e721ce4046a7a9928f95398d070e2ab27fb: Status 404 returned error can't find the container with id bdba2bb5f380610dc4ead2b1de163e721ce4046a7a9928f95398d070e2ab27fb Apr 22 14:15:27.198381 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:15:27.198351 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod836b411b_66a6_4504_9937_fe987775439a.slice/crio-a87e44162dee40f842ff4212adeb9f7ecdb24f48bde6333837a5b6b0573238f3 WatchSource:0}: Error finding container a87e44162dee40f842ff4212adeb9f7ecdb24f48bde6333837a5b6b0573238f3: Status 404 returned error can't find the container with id a87e44162dee40f842ff4212adeb9f7ecdb24f48bde6333837a5b6b0573238f3 Apr 22 14:15:27.344528 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.344493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:27.344704 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:27.344669 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:27.344760 ip-10-0-139-83 kubenswrapper[2576]: 
E0422 14:15:27.344732 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:28.344711582 +0000 UTC m=+3.161491946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:27.445080 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.445044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:27.445280 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:27.445189 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:27.445280 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:27.445206 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:27.445280 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:27.445218 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:27.445280 
ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:27.445272 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:28.445253545 +0000 UTC m=+3.262033902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:27.777893 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.777669 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:26 +0000 UTC" deadline="2027-09-30 01:34:42.162338074 +0000 UTC" Apr 22 14:15:27.777893 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.777709 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12611h19m14.384633438s" Apr 22 14:15:27.904923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.904800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" event={"ID":"4b55a5e2e0ee6a678340d55dba3cf653","Type":"ContainerStarted","Data":"c393bde836fd76d3129fa9b67056eaa7fec3f494b41c663c7d00d24c368c6853"} Apr 22 14:15:27.911002 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.910959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zflp5" event={"ID":"ace2c57e-f44e-4d5d-b3fc-b036816a748d","Type":"ContainerStarted","Data":"bdba2bb5f380610dc4ead2b1de163e721ce4046a7a9928f95398d070e2ab27fb"} Apr 22 
14:15:27.920339 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.920268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" event={"ID":"eca26ba6-2d35-49df-a2f1-164475c0423f","Type":"ContainerStarted","Data":"d4ea384e2eb7a231cb41f72ed01bb7afae615992c63eed0e5c309bc5a459d5d3"}
Apr 22 14:15:27.925293 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.925263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"f3ed744d185b457ea55b3f29f5f052e091bebcca2c09d1cf0b6364f6f264ca6e"}
Apr 22 14:15:27.933689 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.933670 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:27.933980 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.933951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g9d2z" event={"ID":"1563b033-b9b8-425b-ab3d-4a3b05b42fec","Type":"ContainerStarted","Data":"322dd204ed7b57b188cf5170148da34df708d9a48d476f8929ed7035e1a6a4f1"}
Apr 22 14:15:27.945743 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.945717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jrk58" event={"ID":"2f743df8-0701-4855-a2fa-4b71d8a6efc9","Type":"ContainerStarted","Data":"912461b33322dda9cf4b3fb30bbb24c705cda2a38c8c63186f3400ad5e1bb3df"}
Apr 22 14:15:27.956772 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.956741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" event={"ID":"949c3d68cacea17dc74f73d3f1da9031","Type":"ContainerStarted","Data":"d9e783805f8fbe77a0cd9e64f429e2d285c4cbb055fdd524a717e9b38d2a375e"}
Apr 22 14:15:27.969455 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.969413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pn6dm" event={"ID":"836b411b-66a6-4504-9937-fe987775439a","Type":"ContainerStarted","Data":"a87e44162dee40f842ff4212adeb9f7ecdb24f48bde6333837a5b6b0573238f3"}
Apr 22 14:15:27.978135 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.978113 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerStarted","Data":"77dac228e483ed6ba06a0e5d913d8fd65efa60ef2a1603e48d96960665c4d628"}
Apr 22 14:15:28.000655 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.997084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9rd7w" event={"ID":"377748a7-900a-4086-b92d-5dcf4538b46f","Type":"ContainerStarted","Data":"b8d85815afe4ac110a14480564163576596cc83c9651beb9600d82a54c5a8029"}
Apr 22 14:15:28.000655 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:27.999140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lhmjg" event={"ID":"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba","Type":"ContainerStarted","Data":"a2ffff9e8a3691d2aa7fc7608acf35216e4120cb2aafeb99b501c92cd5094704"}
Apr 22 14:15:28.041552 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.041474 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:28.352659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.352100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:28.352659 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.352241 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:28.352659 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.352305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.35228483 +0000 UTC m=+5.169065173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:28.453118 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.453071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:28.453773 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.453328 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:28.453773 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.453349 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:28.453773 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.453363 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:28.453773 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.453422 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:30.45340244 +0000 UTC m=+5.270182786 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:28.778338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.778225 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:26 +0000 UTC" deadline="2028-01-01 14:18:12.097979298 +0000 UTC"
Apr 22 14:15:28.778338 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.778265 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14856h2m43.31971803s"
Apr 22 14:15:28.892152 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.892114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:28.892337 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.892249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:28.892714 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:28.892693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:28.892844 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:28.892801 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:29.207845 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:29.207799 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:29.452889 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:29.452856 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:30.369410 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.369377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:30.369866 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.369533 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:30.369866 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.369603 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:34.369584823 +0000 UTC m=+9.186365168 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:30.459686 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.458978 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7bgl8"]
Apr 22 14:15:30.461248 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.460918 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.461248 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.460998 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:30.470167 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.470138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:30.470316 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.470300 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:30.470372 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.470325 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:30.470372 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.470360 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:30.470476 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.470416 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:34.470399757 +0000 UTC m=+9.287180103 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:30.570896 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.570858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.571057 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.570915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/274b8db4-5e01-406a-b732-06e1a0f63ab2-dbus\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.571057 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.570947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/274b8db4-5e01-406a-b732-06e1a0f63ab2-kubelet-config\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.672211 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.672097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/274b8db4-5e01-406a-b732-06e1a0f63ab2-dbus\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.672211 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.672147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/274b8db4-5e01-406a-b732-06e1a0f63ab2-kubelet-config\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.672440 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.672243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.672440 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.672246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/274b8db4-5e01-406a-b732-06e1a0f63ab2-dbus\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.672440 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.672247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/274b8db4-5e01-406a-b732-06e1a0f63ab2-kubelet-config\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:30.672440 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.672363 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:30.672440 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.672426 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret podName:274b8db4-5e01-406a-b732-06e1a0f63ab2 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:31.172408097 +0000 UTC m=+5.989188450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret") pod "global-pull-secret-syncer-7bgl8" (UID: "274b8db4-5e01-406a-b732-06e1a0f63ab2") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:30.892755 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.892721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:30.892931 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:30.892743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:30.892931 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.892881 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:30.893049 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:30.892959 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:31.176277 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:31.176169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:31.176465 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:31.176373 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:31.176465 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:31.176442 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret podName:274b8db4-5e01-406a-b732-06e1a0f63ab2 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:32.176420617 +0000 UTC m=+6.993200965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret") pod "global-pull-secret-syncer-7bgl8" (UID: "274b8db4-5e01-406a-b732-06e1a0f63ab2") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:31.892875 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:31.892843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:31.893314 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:31.892974 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:32.183578 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:32.183499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:32.183723 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:32.183672 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:32.183782 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:32.183755 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret podName:274b8db4-5e01-406a-b732-06e1a0f63ab2 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:34.183735421 +0000 UTC m=+9.000515769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret") pod "global-pull-secret-syncer-7bgl8" (UID: "274b8db4-5e01-406a-b732-06e1a0f63ab2") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:32.893587 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:32.892884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:32.893587 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:32.893007 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:32.893587 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:32.893436 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:32.893587 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:32.893538 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:33.892811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:33.892777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:33.893011 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:33.892937 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:34.203550 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:34.203436 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:34.203974 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.203597 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:34.203974 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.203680 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret podName:274b8db4-5e01-406a-b732-06e1a0f63ab2 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:38.203659449 +0000 UTC m=+13.020439814 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret") pod "global-pull-secret-syncer-7bgl8" (UID: "274b8db4-5e01-406a-b732-06e1a0f63ab2") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:34.405711 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:34.405542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:34.405884 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.405745 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:34.405884 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.405811 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:42.405792162 +0000 UTC m=+17.222572505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:34.506733 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:34.506639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:34.506918 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.506875 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:34.506918 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.506893 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:34.506918 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.506907 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:34.507080 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.506966 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:42.506948009 +0000 UTC m=+17.323728369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:34.892167 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:34.892129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:34.892362 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.892270 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:34.892703 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:34.892683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:34.892812 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:34.892787 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:35.893057 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:35.893020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:35.893467 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:35.893141 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:36.892381 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:36.892339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:36.892571 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:36.892339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:36.892571 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:36.892483 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:36.892676 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:36.892624 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:37.892683 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:37.892649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:37.893182 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:37.892763 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:38.232763 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:38.232668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:38.232944 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:38.232773 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:38.232944 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:38.232839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret podName:274b8db4-5e01-406a-b732-06e1a0f63ab2 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:46.232810057 +0000 UTC m=+21.049590405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret") pod "global-pull-secret-syncer-7bgl8" (UID: "274b8db4-5e01-406a-b732-06e1a0f63ab2") : object "kube-system"/"original-pull-secret" not registered
Apr 22 14:15:38.892927 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:38.892889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:38.892927 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:38.892914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:38.893419 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:38.893051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:38.893419 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:38.893175 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:39.895132 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:39.895101 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:39.895504 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:39.895212 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:40.892788 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:40.892751 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:40.892986 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:40.892751 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:40.892986 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:40.892891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773"
Apr 22 14:15:40.892986 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:40.892932 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:15:41.892963 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:41.892933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:15:41.893408 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:41.893047 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2"
Apr 22 14:15:42.463465 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:42.463431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:15:42.463640 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.463549 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:42.463640 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.463615 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:58.463595696 +0000 UTC m=+33.280376047 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:42.564065 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:42.564028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:42.564245 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.564192 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:42.564245 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.564226 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:42.564245 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.564240 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:42.564347 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.564298 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:58.564280721 +0000 UTC m=+33.381061067 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:42.892499 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:42.892463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:42.892694 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:42.892463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:42.892694 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.892568 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:42.892694 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:42.892673 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:43.892479 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:43.892444 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:43.892944 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:43.892559 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:44.892109 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:44.892035 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:44.892255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:44.892036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:44.892255 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:44.892141 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:44.892255 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:44.892222 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:45.893573 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:45.893545 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:45.894260 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:45.893665 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:46.038675 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.038639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zflp5" event={"ID":"ace2c57e-f44e-4d5d-b3fc-b036816a748d","Type":"ContainerStarted","Data":"c35d71a71798dfe333d87ba28c725fa6f2227144f1f92d6246025c2686058e5c"} Apr 22 14:15:46.039790 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.039764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" event={"ID":"eca26ba6-2d35-49df-a2f1-164475c0423f","Type":"ContainerStarted","Data":"4350dadfbb3b4c486bafca9a799eb059ad7f0ada394db6fadea3081e9be4044e"} Apr 22 14:15:46.044161 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.044136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"8c6630088461709609d4adfb5027bcb68ec4692b0064ac15a89e40265ec8e582"} Apr 22 14:15:46.044241 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.044168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"ffa265ef742a34cd5c2980e3868fe911ce2b97dbe212a46849ee3adf0efb956c"} Apr 22 14:15:46.044241 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.044182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"035bb247e14a9680977d9a8431303632e648971c8c6ffdc0f2c1b8f767af4d29"} Apr 22 14:15:46.044241 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.044191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" 
event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"99c5c8b7e1d7fde02e7cfd9cf340b680ee72a6097e2e54e07f79d66729ba49e4"} Apr 22 14:15:46.044241 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.044200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"272ac05b71a34acf3f2721a4341733b186c8e6d6f6ec6f4bc5f6f17b45a2f9a8"} Apr 22 14:15:46.044241 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.044208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"96678f5dad70641650839e871a88faf8310c690a5ef2f89889645391154d3bc3"} Apr 22 14:15:46.045296 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.045278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g9d2z" event={"ID":"1563b033-b9b8-425b-ab3d-4a3b05b42fec","Type":"ContainerStarted","Data":"c7b1b8b501d12957516e181a8a7b3ed6bbc11af399002e4624c3abf2f91f39d3"} Apr 22 14:15:46.046466 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.046446 2576 generic.go:358] "Generic (PLEG): container finished" podID="949c3d68cacea17dc74f73d3f1da9031" containerID="74556a4af0280b413b0fbcceaf1104d85f6d47490fb5cb6936f6cf47fec4dea9" exitCode=0 Apr 22 14:15:46.046533 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.046509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" event={"ID":"949c3d68cacea17dc74f73d3f1da9031","Type":"ContainerDied","Data":"74556a4af0280b413b0fbcceaf1104d85f6d47490fb5cb6936f6cf47fec4dea9"} Apr 22 14:15:46.046652 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.046631 2576 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" Apr 22 14:15:46.047742 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.047721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pn6dm" event={"ID":"836b411b-66a6-4504-9937-fe987775439a","Type":"ContainerStarted","Data":"4c0bac5a5c8d51db9584cfd64b5360928244b2a4343b3bdccf7ad0c10223548a"} Apr 22 14:15:46.048924 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.048903 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e0334c-0350-4896-8f0a-f8f03953749a" containerID="8e70772e939457ce216d0d4e71faee870b30a54a720ed2cced6d38a1185872b7" exitCode=0 Apr 22 14:15:46.049021 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.048963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerDied","Data":"8e70772e939457ce216d0d4e71faee870b30a54a720ed2cced6d38a1185872b7"} Apr 22 14:15:46.050208 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.050184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9rd7w" event={"ID":"377748a7-900a-4086-b92d-5dcf4538b46f","Type":"ContainerStarted","Data":"d1b95d42ed9f4ff9d3609a218c171190d050da9769866600e47c6bfe5e8e051a"} Apr 22 14:15:46.051518 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.051502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lhmjg" event={"ID":"7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba","Type":"ContainerStarted","Data":"7d38029e136c071835f700590a85490ebe4e9e5ff9c85360ce29039dcce30856"} Apr 22 14:15:46.052763 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.052746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" 
event={"ID":"4b55a5e2e0ee6a678340d55dba3cf653","Type":"ContainerStarted","Data":"1c9cb31d63a5482f0324d39f84322753bdff636eba03de87c9211cc4ae6e336e"} Apr 22 14:15:46.057626 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.057612 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:46.058300 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.058285 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal"] Apr 22 14:15:46.062163 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.062119 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zflp5" podStartSLOduration=2.557390792 podStartE2EDuration="20.062109027s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.146662152 +0000 UTC m=+1.963442495" lastFinishedPulling="2026-04-22 14:15:44.651380381 +0000 UTC m=+19.468160730" observedRunningTime="2026-04-22 14:15:46.061711022 +0000 UTC m=+20.878491387" watchObservedRunningTime="2026-04-22 14:15:46.062109027 +0000 UTC m=+20.878889391" Apr 22 14:15:46.081775 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.081726 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pn6dm" podStartSLOduration=2.59932983 podStartE2EDuration="20.081712936s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.200221462 +0000 UTC m=+2.017001805" lastFinishedPulling="2026-04-22 14:15:44.682604554 +0000 UTC m=+19.499384911" observedRunningTime="2026-04-22 14:15:46.081467646 +0000 UTC m=+20.898248012" watchObservedRunningTime="2026-04-22 14:15:46.081712936 +0000 UTC m=+20.898493300" Apr 22 14:15:46.133398 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.133351 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lhmjg" podStartSLOduration=2.566185205 podStartE2EDuration="20.133335822s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.082417956 +0000 UTC m=+1.899198299" lastFinishedPulling="2026-04-22 14:15:44.649568567 +0000 UTC m=+19.466348916" observedRunningTime="2026-04-22 14:15:46.114116074 +0000 UTC m=+20.930896430" watchObservedRunningTime="2026-04-22 14:15:46.133335822 +0000 UTC m=+20.950116187" Apr 22 14:15:46.153909 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.153862 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-83.ec2.internal" podStartSLOduration=20.153844405 podStartE2EDuration="20.153844405s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:46.153638318 +0000 UTC m=+20.970418675" watchObservedRunningTime="2026-04-22 14:15:46.153844405 +0000 UTC m=+20.970624771" Apr 22 14:15:46.154024 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.153997 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9rd7w" podStartSLOduration=2.601492522 podStartE2EDuration="20.153988953s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.097378417 +0000 UTC m=+1.914158760" lastFinishedPulling="2026-04-22 14:15:44.649874825 +0000 UTC m=+19.466655191" observedRunningTime="2026-04-22 14:15:46.133461699 +0000 UTC m=+20.950242047" watchObservedRunningTime="2026-04-22 14:15:46.153988953 +0000 UTC m=+20.970769377" Apr 22 14:15:46.174258 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.174208 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g9d2z" 
podStartSLOduration=3.591597372 podStartE2EDuration="21.174194845s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.067288231 +0000 UTC m=+1.884068574" lastFinishedPulling="2026-04-22 14:15:44.649885703 +0000 UTC m=+19.466666047" observedRunningTime="2026-04-22 14:15:46.174054412 +0000 UTC m=+20.990834776" watchObservedRunningTime="2026-04-22 14:15:46.174194845 +0000 UTC m=+20.990975209" Apr 22 14:15:46.296401 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.296365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:46.296506 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:46.296491 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:46.296577 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:46.296566 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret podName:274b8db4-5e01-406a-b732-06e1a0f63ab2 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:02.296544734 +0000 UTC m=+37.113325091 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret") pod "global-pull-secret-syncer-7bgl8" (UID: "274b8db4-5e01-406a-b732-06e1a0f63ab2") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:46.797728 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.797537 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:15:46.810441 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.810346 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:46.797725085Z","UUID":"fe865bea-9c6e-415d-bd4d-422f489d9b54","Handler":null,"Name":"","Endpoint":""} Apr 22 14:15:46.812054 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.812035 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:15:46.812168 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.812111 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:15:46.892884 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.892853 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:46.893061 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:46.892964 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:46.893061 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:46.892854 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:46.893180 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:46.893145 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:47.056678 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:47.056601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" event={"ID":"eca26ba6-2d35-49df-a2f1-164475c0423f","Type":"ContainerStarted","Data":"7d0d247d218fcae25b3288d27e0010e7eee74ff094654e852bc374e1ef4e4fb8"} Apr 22 14:15:47.059644 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:47.059615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jrk58" event={"ID":"2f743df8-0701-4855-a2fa-4b71d8a6efc9","Type":"ContainerStarted","Data":"184c1ad1f163cde08d6abb22b8a570e4d87bd7e44b17a4e660846bb920f6344d"} Apr 22 14:15:47.061889 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:47.061781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" event={"ID":"949c3d68cacea17dc74f73d3f1da9031","Type":"ContainerStarted","Data":"212f32fafbd33c267f507bf4c860843e4db8faa19b7da05ea1e8af5aade008ce"} Apr 22 14:15:47.076097 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:47.076055 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jrk58" podStartSLOduration=4.479702367 podStartE2EDuration="22.076041918s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.053135824 +0000 UTC m=+1.869916167" lastFinishedPulling="2026-04-22 14:15:44.649475361 +0000 UTC m=+19.466255718" observedRunningTime="2026-04-22 14:15:47.075865593 +0000 UTC m=+21.892645971" watchObservedRunningTime="2026-04-22 14:15:47.076041918 +0000 UTC m=+21.892822283" Apr 22 14:15:47.093066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:47.093018 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-83.ec2.internal" podStartSLOduration=1.093006794 podStartE2EDuration="1.093006794s" podCreationTimestamp="2026-04-22 14:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:47.092652771 +0000 UTC m=+21.909433137" watchObservedRunningTime="2026-04-22 14:15:47.093006794 +0000 UTC m=+21.909787158" Apr 22 14:15:47.896455 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:47.893160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:47.896455 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:47.893319 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:48.067556 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:48.067479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" event={"ID":"eca26ba6-2d35-49df-a2f1-164475c0423f","Type":"ContainerStarted","Data":"af95a92dd130156f6def609b80248c63849edbe188d9b78e2c20c8b63876c829"} Apr 22 14:15:48.071710 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:48.071660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"3d627a8daa831fdaa330d75fe43d926581171f9cd7be961901f806cf195d4bfb"} Apr 22 14:15:48.086941 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:48.086899 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fg7jz" podStartSLOduration=1.380588926 podStartE2EDuration="22.08688721s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.134228807 +0000 UTC m=+1.951009150" lastFinishedPulling="2026-04-22 14:15:47.840527091 +0000 UTC m=+22.657307434" observedRunningTime="2026-04-22 14:15:48.086781653 +0000 UTC m=+22.903562017" watchObservedRunningTime="2026-04-22 14:15:48.08688721 +0000 UTC m=+22.903667575" Apr 22 14:15:48.892351 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:48.892321 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:48.892351 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:48.892358 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:48.892534 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:48.892448 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:48.892597 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:48.892571 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:49.892976 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:49.892943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:49.893744 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:49.893056 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:50.544283 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:50.544034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:50.544591 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:50.544575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:50.892675 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:50.892637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:50.892839 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:50.892642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:50.892839 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:50.892757 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:50.892927 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:50.892794 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:51.078920 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.078880 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e0334c-0350-4896-8f0a-f8f03953749a" containerID="66c126c05dbb4614988e485d40c60f0f8cb8e5452d0d8e8faaf6a6e5ce947495" exitCode=0 Apr 22 14:15:51.079334 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.078977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerDied","Data":"66c126c05dbb4614988e485d40c60f0f8cb8e5452d0d8e8faaf6a6e5ce947495"} Apr 22 14:15:51.082198 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.082174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" event={"ID":"baba1b59-01cc-4a9a-8350-a118e41a4e8b","Type":"ContainerStarted","Data":"6947d215201f92bd3fc9df1da1c69073901053a3935f79c54b5714d8d74cabb4"} Apr 22 14:15:51.082478 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.082463 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:51.082549 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.082495 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:51.082549 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.082507 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:51.096518 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.096496 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:51.096639 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.096604 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" Apr 22 14:15:51.892236 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.892020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:51.892390 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:51.892265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:51.993882 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.993832 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8" podStartSLOduration=7.756766452 podStartE2EDuration="25.993800024s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.112734116 +0000 UTC m=+1.929514459" lastFinishedPulling="2026-04-22 14:15:45.349767685 +0000 UTC m=+20.166548031" observedRunningTime="2026-04-22 14:15:51.152503849 +0000 UTC m=+25.969284214" watchObservedRunningTime="2026-04-22 14:15:51.993800024 +0000 UTC m=+26.810580388" Apr 22 14:15:51.994177 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.994154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7bgl8"] Apr 22 14:15:51.997453 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.997431 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xj22k"] Apr 22 14:15:51.997582 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.997571 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:51.997689 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:51.997668 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:51.998076 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.998055 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dngb2"] Apr 22 14:15:51.998180 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:51.998168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:51.998303 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:51.998270 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:52.085982 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:52.085949 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e0334c-0350-4896-8f0a-f8f03953749a" containerID="dcad38151b6c1ccc838a624f2687e7a5b26690a100b63772ee8c6ac04216515f" exitCode=0 Apr 22 14:15:52.086552 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:52.086036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerDied","Data":"dcad38151b6c1ccc838a624f2687e7a5b26690a100b63772ee8c6ac04216515f"} Apr 22 14:15:52.086552 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:52.086049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:52.086552 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:52.086370 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:53.090352 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:53.090323 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e0334c-0350-4896-8f0a-f8f03953749a" containerID="107727ccda8815866bd72ede511071f0ed77ef263d89b178770d5719888b2703" exitCode=0 Apr 22 14:15:53.090777 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:53.090408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerDied","Data":"107727ccda8815866bd72ede511071f0ed77ef263d89b178770d5719888b2703"} Apr 22 14:15:53.892923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:53.892887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:53.892923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:53.892920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:53.893211 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:53.892904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:53.893211 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:53.893010 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:53.893211 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:53.893054 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:53.893211 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:53.893159 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:55.893911 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:55.893870 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:55.894601 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:55.893972 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:55.894601 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:55.893982 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:55.894601 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:55.894005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:55.894601 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:55.894064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:55.894601 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:55.894109 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:56.889607 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:56.889564 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:56.889793 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:56.889741 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 14:15:56.891046 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:56.891017 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g9d2z" Apr 22 14:15:57.892297 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:57.892257 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:57.892749 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:57.892380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:57.892749 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:57.892398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:57.892749 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:57.892374 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xj22k" podUID="0f845cc3-634e-4134-8f72-6e6eb367d773" Apr 22 14:15:57.892749 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:57.892508 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1" Apr 22 14:15:57.892749 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:57.892596 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7bgl8" podUID="274b8db4-5e01-406a-b732-06e1a0f63ab2" Apr 22 14:15:58.031528 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.031503 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-83.ec2.internal" event="NodeReady" Apr 22 14:15:58.031699 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.031654 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:15:58.094470 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.094425 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kcjkd"] Apr 22 14:15:58.121833 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.121783 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tr25t"] Apr 22 14:15:58.121978 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.121967 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.126569 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.126544 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:15:58.126718 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.126542 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:15:58.126887 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.126864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:15:58.127161 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.127128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lthk8\"" Apr 22 14:15:58.145555 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.145532 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tr25t"] Apr 22 14:15:58.145555 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.145558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kcjkd"] Apr 22 14:15:58.145768 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.145676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.149559 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.149536 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:15:58.149687 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.149583 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:15:58.149892 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.149878 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpmwg\"" Apr 22 14:15:58.292138 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.292095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-config-volume\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.292324 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.292166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.292324 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.292268 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-tmp-dir\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.292324 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.292289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2hk\" (UniqueName: \"kubernetes.io/projected/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-kube-api-access-wz2hk\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.292324 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.292305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.292324 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.292323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk8h\" (UniqueName: \"kubernetes.io/projected/faa0ae94-53cb-46ba-af35-0e690ed5b286-kube-api-access-sbk8h\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.393063 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.392970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-tmp-dir\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.393063 ip-10-0-139-83 kubenswrapper[2576]: I0422 
14:15:58.393014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2hk\" (UniqueName: \"kubernetes.io/projected/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-kube-api-access-wz2hk\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.393063 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.393040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.393063 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.393065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk8h\" (UniqueName: \"kubernetes.io/projected/faa0ae94-53cb-46ba-af35-0e690ed5b286-kube-api-access-sbk8h\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.393327 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.393105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-config-volume\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.393327 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.393138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.393327 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.393243 2576 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:58.393327 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.393297 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:58.893278231 +0000 UTC m=+33.710058588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found Apr 22 14:15:58.393526 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.393335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-tmp-dir\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.393526 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.393378 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:58.393526 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.393429 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:58.893413117 +0000 UTC m=+33.710193473 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found Apr 22 14:15:58.393805 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.393780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-config-volume\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.412597 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.412455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2hk\" (UniqueName: \"kubernetes.io/projected/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-kube-api-access-wz2hk\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.412748 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.412669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk8h\" (UniqueName: \"kubernetes.io/projected/faa0ae94-53cb-46ba-af35-0e690ed5b286-kube-api-access-sbk8h\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.493949 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.493910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:58.494158 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.494074 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: 
object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:58.494158 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.494153 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.494136872 +0000 UTC m=+65.310917216 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:58.594611 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.594559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k" Apr 22 14:15:58.594801 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.594764 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:58.594801 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.594788 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:58.594801 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.594801 2576 projected.go:194] Error preparing data for projected volume kube-api-access-v5h5r for pod openshift-network-diagnostics/network-check-target-xj22k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:58.594984 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.594890 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r podName:0f845cc3-634e-4134-8f72-6e6eb367d773 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.594869928 +0000 UTC m=+65.411650275 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-v5h5r" (UniqueName: "kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r") pod "network-check-target-xj22k" (UID: "0f845cc3-634e-4134-8f72-6e6eb367d773") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:58.896716 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.896652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd" Apr 22 14:15:58.897176 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:58.896812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t" Apr 22 14:15:58.897176 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.897116 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:58.897176 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.897177 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:58.897313 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.897283 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.897221631 +0000 UTC m=+34.714001997 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found Apr 22 14:15:58.897367 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:58.897323 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.897311293 +0000 UTC m=+34.714091640 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found Apr 22 14:15:59.892570 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.892529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:15:59.892883 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.892530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8" Apr 22 14:15:59.892883 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.892530 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:15:59.895591 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.895570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 14:15:59.895784 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.895767 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 14:15:59.895875 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.895838 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 14:15:59.896914 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.896893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jjvkf\""
Apr 22 14:15:59.897188 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.897078 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mfkfw\""
Apr 22 14:15:59.897188 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.897164 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 14:15:59.905052 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.905030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:15:59.905170 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:59.905148 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:59.905231 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:15:59.905182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:15:59.905231 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:59.905206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.905188099 +0000 UTC m=+36.721968449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found
Apr 22 14:15:59.905339 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:59.905255 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:59.905339 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:15:59.905306 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.9052937 +0000 UTC m=+36.722074060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:00.105751 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:00.105721 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e0334c-0350-4896-8f0a-f8f03953749a" containerID="dc1c2617441841a8fca219591e9dfc9fc1cf86282f30e787947c95c79cc5e4a1" exitCode=0
Apr 22 14:16:00.105751 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:00.105756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerDied","Data":"dc1c2617441841a8fca219591e9dfc9fc1cf86282f30e787947c95c79cc5e4a1"}
Apr 22 14:16:01.110498 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:01.110467 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e0334c-0350-4896-8f0a-f8f03953749a" containerID="1185752735af8e33657d045fdc00b6db5050f8730b7dcfba6512be403607fbe1" exitCode=0
Apr 22 14:16:01.110861 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:01.110525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerDied","Data":"1185752735af8e33657d045fdc00b6db5050f8730b7dcfba6512be403607fbe1"}
Apr 22 14:16:01.920528 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:01.920499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:16:01.920686 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:01.920541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:16:01.920686 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:01.920632 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:01.920686 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:01.920637 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:01.920686 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:01.920677 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.920664442 +0000 UTC m=+40.737444785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:01.920686 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:01.920689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.920683442 +0000 UTC m=+40.737463785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found
Apr 22 14:16:02.115844 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:02.115798 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" event={"ID":"87e0334c-0350-4896-8f0a-f8f03953749a","Type":"ContainerStarted","Data":"ae705a6baf1b4472f032dbb266b68b508d188de367c6f6579d672e6d41a3e67a"}
Apr 22 14:16:02.142102 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:02.142056 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lbhgd" podStartSLOduration=4.261410594 podStartE2EDuration="36.142043322s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="2026-04-22 14:15:27.140787541 +0000 UTC m=+1.957567888" lastFinishedPulling="2026-04-22 14:15:59.02142026 +0000 UTC m=+33.838200616" observedRunningTime="2026-04-22 14:16:02.140371368 +0000 UTC m=+36.957151732" watchObservedRunningTime="2026-04-22 14:16:02.142043322 +0000 UTC m=+36.958823687"
Apr 22 14:16:02.323449 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:02.323419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:16:02.326386 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:02.326354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/274b8db4-5e01-406a-b732-06e1a0f63ab2-original-pull-secret\") pod \"global-pull-secret-syncer-7bgl8\" (UID: \"274b8db4-5e01-406a-b732-06e1a0f63ab2\") " pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:16:02.607381 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:02.607302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgl8"
Apr 22 14:16:02.751702 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:02.751541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7bgl8"]
Apr 22 14:16:02.754914 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:16:02.754879 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274b8db4_5e01_406a_b732_06e1a0f63ab2.slice/crio-472989f28652b6478b05733778239a871bb7951f928fed56dcf0445c4b2de27b WatchSource:0}: Error finding container 472989f28652b6478b05733778239a871bb7951f928fed56dcf0445c4b2de27b: Status 404 returned error can't find the container with id 472989f28652b6478b05733778239a871bb7951f928fed56dcf0445c4b2de27b
Apr 22 14:16:03.022286 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.022214 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"]
Apr 22 14:16:03.055759 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.055732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"]
Apr 22 14:16:03.055929 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.055851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.059362 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.059342 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 14:16:03.060442 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.060422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vd4kk\""
Apr 22 14:16:03.060625 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.060611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 14:16:03.060694 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.060675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 14:16:03.060745 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.060706 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 14:16:03.118359 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.118324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7bgl8" event={"ID":"274b8db4-5e01-406a-b732-06e1a0f63ab2","Type":"ContainerStarted","Data":"472989f28652b6478b05733778239a871bb7951f928fed56dcf0445c4b2de27b"}
Apr 22 14:16:03.228412 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.228380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7md8\" (UniqueName: \"kubernetes.io/projected/57d6baf5-437b-4343-94a8-93b21909b3b0-kube-api-access-q7md8\") pod \"managed-serviceaccount-addon-agent-56bcb89c67-h5skj\" (UID: \"57d6baf5-437b-4343-94a8-93b21909b3b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.228589 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.228465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57d6baf5-437b-4343-94a8-93b21909b3b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56bcb89c67-h5skj\" (UID: \"57d6baf5-437b-4343-94a8-93b21909b3b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.329073 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.329040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7md8\" (UniqueName: \"kubernetes.io/projected/57d6baf5-437b-4343-94a8-93b21909b3b0-kube-api-access-q7md8\") pod \"managed-serviceaccount-addon-agent-56bcb89c67-h5skj\" (UID: \"57d6baf5-437b-4343-94a8-93b21909b3b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.329251 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.329100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57d6baf5-437b-4343-94a8-93b21909b3b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56bcb89c67-h5skj\" (UID: \"57d6baf5-437b-4343-94a8-93b21909b3b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.332796 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.332762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57d6baf5-437b-4343-94a8-93b21909b3b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-56bcb89c67-h5skj\" (UID: \"57d6baf5-437b-4343-94a8-93b21909b3b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.337755 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.337730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7md8\" (UniqueName: \"kubernetes.io/projected/57d6baf5-437b-4343-94a8-93b21909b3b0-kube-api-access-q7md8\") pod \"managed-serviceaccount-addon-agent-56bcb89c67-h5skj\" (UID: \"57d6baf5-437b-4343-94a8-93b21909b3b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.376623 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.376595 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"
Apr 22 14:16:03.501995 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:03.501964 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj"]
Apr 22 14:16:03.505420 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:16:03.505382 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57d6baf5_437b_4343_94a8_93b21909b3b0.slice/crio-9952e07f11929fe753ac165425333895874a6d728a3f792881b7f8648da5abaf WatchSource:0}: Error finding container 9952e07f11929fe753ac165425333895874a6d728a3f792881b7f8648da5abaf: Status 404 returned error can't find the container with id 9952e07f11929fe753ac165425333895874a6d728a3f792881b7f8648da5abaf
Apr 22 14:16:04.122178 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:04.122138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj" event={"ID":"57d6baf5-437b-4343-94a8-93b21909b3b0","Type":"ContainerStarted","Data":"9952e07f11929fe753ac165425333895874a6d728a3f792881b7f8648da5abaf"}
Apr 22 14:16:05.950266 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:05.950232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:16:05.950743 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:05.950281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:16:05.950743 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:05.950382 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:05.950743 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:05.950429 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:05.950743 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:05.950454 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.950436699 +0000 UTC m=+48.767217063 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found
Apr 22 14:16:05.950743 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:05.950476 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.950460519 +0000 UTC m=+48.767240863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:09.133234 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:09.133198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj" event={"ID":"57d6baf5-437b-4343-94a8-93b21909b3b0","Type":"ContainerStarted","Data":"601a4a2727638f3f7fc8f12490001c00e7a8354a46f99884d35a69afaa7f343e"}
Apr 22 14:16:09.134408 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:09.134386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7bgl8" event={"ID":"274b8db4-5e01-406a-b732-06e1a0f63ab2","Type":"ContainerStarted","Data":"1e20a80621fbc79797f27bb42e43a68e0e09065fc139b9bccf1601452288ec76"}
Apr 22 14:16:09.149074 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:09.149025 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj" podStartSLOduration=1.450965954 podStartE2EDuration="6.14900935s" podCreationTimestamp="2026-04-22 14:16:03 +0000 UTC" firstStartedPulling="2026-04-22 14:16:03.507623143 +0000 UTC m=+38.324403487" lastFinishedPulling="2026-04-22 14:16:08.205666525 +0000 UTC m=+43.022446883" observedRunningTime="2026-04-22 14:16:09.148181775 +0000 UTC m=+43.964962139" watchObservedRunningTime="2026-04-22 14:16:09.14900935 +0000 UTC m=+43.965789709"
Apr 22 14:16:09.163045 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:09.162995 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7bgl8" podStartSLOduration=33.703247616 podStartE2EDuration="39.162979405s" podCreationTimestamp="2026-04-22 14:15:30 +0000 UTC" firstStartedPulling="2026-04-22 14:16:02.756366607 +0000 UTC m=+37.573146951" lastFinishedPulling="2026-04-22 14:16:08.216098396 +0000 UTC m=+43.032878740" observedRunningTime="2026-04-22 14:16:09.162370881 +0000 UTC m=+43.979151247" watchObservedRunningTime="2026-04-22 14:16:09.162979405 +0000 UTC m=+43.979759770"
Apr 22 14:16:14.009249 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:14.009203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:16:14.009249 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:14.009255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:16:14.009720 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:14.009349 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:14.009720 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:14.009364 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:14.009720 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:14.009402 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.009387993 +0000 UTC m=+64.826168336 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:14.009720 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:14.009439 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:30.009413211 +0000 UTC m=+64.826193756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found
Apr 22 14:16:23.105424 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:23.105390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbtw8"
Apr 22 14:16:30.018876 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.018842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:16:30.019239 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.018888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:16:30.019239 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:30.018977 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:30.019239 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:30.018981 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:30.019239 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:30.019036 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:02.019021852 +0000 UTC m=+96.835802195 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found
Apr 22 14:16:30.019239 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:30.019048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:02.01904303 +0000 UTC m=+96.835823373 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found
Apr 22 14:16:30.523328 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.523293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:16:30.526778 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.526759 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 14:16:30.534329 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:30.534307 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 14:16:30.534416 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:16:30.534378 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:34.534361051 +0000 UTC m=+129.351141400 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : secret "metrics-daemon-secret" not found
Apr 22 14:16:30.623830 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.623779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:16:30.626951 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.626931 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 14:16:30.637626 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.637607 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 14:16:30.647307 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.647286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5h5r\" (UniqueName: \"kubernetes.io/projected/0f845cc3-634e-4134-8f72-6e6eb367d773-kube-api-access-v5h5r\") pod \"network-check-target-xj22k\" (UID: \"0f845cc3-634e-4134-8f72-6e6eb367d773\") " pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:16:30.815669 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.815595 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jjvkf\""
Apr 22 14:16:30.822887 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.822867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:16:30.937950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:30.937918 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xj22k"]
Apr 22 14:16:30.940645 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:16:30.940619 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f845cc3_634e_4134_8f72_6e6eb367d773.slice/crio-a2a8d06786739f9bb21bb297e6cd76d074feaef41143824e4283a879b93bc270 WatchSource:0}: Error finding container a2a8d06786739f9bb21bb297e6cd76d074feaef41143824e4283a879b93bc270: Status 404 returned error can't find the container with id a2a8d06786739f9bb21bb297e6cd76d074feaef41143824e4283a879b93bc270
Apr 22 14:16:31.180352 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:31.180319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xj22k" event={"ID":"0f845cc3-634e-4134-8f72-6e6eb367d773","Type":"ContainerStarted","Data":"a2a8d06786739f9bb21bb297e6cd76d074feaef41143824e4283a879b93bc270"}
Apr 22 14:16:34.186489 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:34.186454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xj22k" event={"ID":"0f845cc3-634e-4134-8f72-6e6eb367d773","Type":"ContainerStarted","Data":"cb7465a5d8ddc6910f50565974ad16a22152c71d75ae8172ea44b5c440f6e687"}
Apr 22 14:16:34.186871 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:34.186579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:16:34.219750 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:16:34.219696 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xj22k" podStartSLOduration=66.719288791 podStartE2EDuration="1m9.21968224s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:16:30.942464483 +0000 UTC m=+65.759244826" lastFinishedPulling="2026-04-22 14:16:33.442857929 +0000 UTC m=+68.259638275" observedRunningTime="2026-04-22 14:16:34.216449157 +0000 UTC m=+69.033229522" watchObservedRunningTime="2026-04-22 14:16:34.21968224 +0000 UTC m=+69.036462633"
Apr 22 14:17:02.046021 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:02.045971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:17:02.046021 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:02.046028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:17:02.046475 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:02.046111 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:17:02.046475 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:02.046114 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:17:02.046475 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:02.046177 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert podName:faa0ae94-53cb-46ba-af35-0e690ed5b286 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:06.046159631 +0000 UTC m=+160.862939974 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert") pod "ingress-canary-kcjkd" (UID: "faa0ae94-53cb-46ba-af35-0e690ed5b286") : secret "canary-serving-cert" not found
Apr 22 14:17:02.046475 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:02.046191 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls podName:d9fbf8cd-bdeb-41fe-ab55-46ff4906722e nodeName:}" failed. No retries permitted until 2026-04-22 14:18:06.046185357 +0000 UTC m=+160.862965700 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls") pod "dns-default-tr25t" (UID: "d9fbf8cd-bdeb-41fe-ab55-46ff4906722e") : secret "dns-default-metrics-tls" not found
Apr 22 14:17:05.190967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:05.190939 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xj22k"
Apr 22 14:17:34.567081 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:34.567036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2"
Apr 22 14:17:34.567642 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:34.567214 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 14:17:34.567642 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:34.567314 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs podName:7d49b78a-27ae-4f41-a759-29b898bf6fe1 nodeName:}" failed. No retries permitted until 2026-04-22 14:19:36.567290995 +0000 UTC m=+251.384071338 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs") pod "network-metrics-daemon-dngb2" (UID: "7d49b78a-27ae-4f41-a759-29b898bf6fe1") : secret "metrics-daemon-secret" not found
Apr 22 14:17:38.008359 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.008314 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-548fb84bdd-rjs5x"]
Apr 22 14:17:38.010238 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.010222 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:38.012866 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.012845 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 14:17:38.013228 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.013208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vnk9b\""
Apr 22 14:17:38.013228 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.013221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 14:17:38.013372 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.013254 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 14:17:38.013372 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.013263 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 14:17:38.013535 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.013518 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 14:17:38.013939 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.013922 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 14:17:38.022291 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.022266 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-548fb84bdd-rjs5x"]
Apr 22 14:17:38.093618 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.093579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:38.093782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.093640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-default-certificate\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:38.093782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.093684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-stats-auth\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:38.093782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.093736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:38.093782 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.093757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxds\" (UniqueName: \"kubernetes.io/projected/eae92029-ce85-4f67-9ea7-939a019950c7-kube-api-access-btxds\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:38.112961 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.112923 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-62jld"]
Apr 22 14:17:38.114803 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.114787 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4"]
Apr 22 14:17:38.114944 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.114928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-62jld"
Apr 22 14:17:38.116506 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.116481 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.118150 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.118129 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:38.118659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.118641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:38.119548 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.119532 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:38.119604 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.119574 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 14:17:38.119779 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.119762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 14:17:38.119953 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.119938 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 14:17:38.120379 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.120359 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tg2fj\"" Apr 22 14:17:38.121655 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.121636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:38.122683 ip-10-0-139-83 
kubenswrapper[2576]: I0422 14:17:38.122660 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-9lm5g\"" Apr 22 14:17:38.123578 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.123560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 14:17:38.133010 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.132988 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 14:17:38.136015 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.135991 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4"] Apr 22 14:17:38.138678 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.138656 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-62jld"] Apr 22 14:17:38.194303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.194303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9478150-727c-42e1-b8be-a5fcc142a5b7-config\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.194532 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc685723-3a77-43be-b573-ae8ca5c62f4a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc685723-3a77-43be-b573-ae8ca5c62f4a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-default-certificate\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-stats-auth\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kmzck\" (UniqueName: \"kubernetes.io/projected/b9478150-727c-42e1-b8be-a5fcc142a5b7-kube-api-access-kmzck\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.194427 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.194532 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.194509 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:38.694488016 +0000 UTC m=+133.511268363 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : secret "router-metrics-certs-default" not found Apr 22 14:17:38.194916 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194544 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9478150-727c-42e1-b8be-a5fcc142a5b7-trusted-ca\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.194916 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btxds\" (UniqueName: \"kubernetes.io/projected/eae92029-ce85-4f67-9ea7-939a019950c7-kube-api-access-btxds\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.194916 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9478150-727c-42e1-b8be-a5fcc142a5b7-serving-cert\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.194916 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.194642 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:17:38.69462345 +0000 UTC m=+133.511403812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:38.194916 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.194676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrfvz\" (UniqueName: \"kubernetes.io/projected/cc685723-3a77-43be-b573-ae8ca5c62f4a-kube-api-access-zrfvz\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.196832 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.196796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-default-certificate\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.196923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.196845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-stats-auth\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.208927 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.208904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxds\" (UniqueName: 
\"kubernetes.io/projected/eae92029-ce85-4f67-9ea7-939a019950c7-kube-api-access-btxds\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.217589 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.217569 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84479cf759-29dvm"] Apr 22 14:17:38.219731 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.219717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.223493 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.223471 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:17:38.223587 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.223573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:17:38.223673 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.223658 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:17:38.223803 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.223789 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x72w4\"" Apr 22 14:17:38.230983 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.230966 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:17:38.237503 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.237485 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84479cf759-29dvm"] Apr 22 14:17:38.295055 ip-10-0-139-83 kubenswrapper[2576]: 
I0422 14:17:38.294976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43d17f3f-034e-4b82-8486-ae98d79cef85-ca-trust-extracted\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295055 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc685723-3a77-43be-b573-ae8ca5c62f4a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.295250 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmzck\" (UniqueName: \"kubernetes.io/projected/b9478150-727c-42e1-b8be-a5fcc142a5b7-kube-api-access-kmzck\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.295250 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-installation-pull-secrets\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295250 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cc685723-3a77-43be-b573-ae8ca5c62f4a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.295250 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9478150-727c-42e1-b8be-a5fcc142a5b7-config\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.295250 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-image-registry-private-configuration\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295480 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295480 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p97gc\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-kube-api-access-p97gc\") pod \"image-registry-84479cf759-29dvm\" (UID: 
\"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295480 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-certificates\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295480 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9478150-727c-42e1-b8be-a5fcc142a5b7-trusted-ca\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.295480 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-trusted-ca\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295699 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9478150-727c-42e1-b8be-a5fcc142a5b7-serving-cert\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.295699 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295531 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zrfvz\" (UniqueName: \"kubernetes.io/projected/cc685723-3a77-43be-b573-ae8ca5c62f4a-kube-api-access-zrfvz\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.295699 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc685723-3a77-43be-b573-ae8ca5c62f4a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.295699 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-bound-sa-token\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.295927 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.295906 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9478150-727c-42e1-b8be-a5fcc142a5b7-config\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.296118 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.296100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9478150-727c-42e1-b8be-a5fcc142a5b7-trusted-ca\") pod \"console-operator-9d4b6777b-62jld\" 
(UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.297730 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.297710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc685723-3a77-43be-b573-ae8ca5c62f4a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.297886 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.297869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9478150-727c-42e1-b8be-a5fcc142a5b7-serving-cert\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.304299 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.304273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmzck\" (UniqueName: \"kubernetes.io/projected/b9478150-727c-42e1-b8be-a5fcc142a5b7-kube-api-access-kmzck\") pod \"console-operator-9d4b6777b-62jld\" (UID: \"b9478150-727c-42e1-b8be-a5fcc142a5b7\") " pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.304447 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.304432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrfvz\" (UniqueName: \"kubernetes.io/projected/cc685723-3a77-43be-b573-ae8ca5c62f4a-kube-api-access-zrfvz\") pod \"kube-storage-version-migrator-operator-6769c5d45-6rwp4\" (UID: \"cc685723-3a77-43be-b573-ae8ca5c62f4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.396288 ip-10-0-139-83 
kubenswrapper[2576]: I0422 14:17:38.396235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396288 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p97gc\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-kube-api-access-p97gc\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-certificates\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-trusted-ca\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-bound-sa-token\") pod 
\"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396390 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43d17f3f-034e-4b82-8486-ae98d79cef85-ca-trust-extracted\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.396391 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.396414 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84479cf759-29dvm: secret "image-registry-tls" not found Apr 22 14:17:38.396520 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.396478 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls podName:43d17f3f-034e-4b82-8486-ae98d79cef85 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:38.896454794 +0000 UTC m=+133.713235137 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls") pod "image-registry-84479cf759-29dvm" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85") : secret "image-registry-tls" not found Apr 22 14:17:38.396884 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-installation-pull-secrets\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396884 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-image-registry-private-configuration\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.396884 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43d17f3f-034e-4b82-8486-ae98d79cef85-ca-trust-extracted\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.397030 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.396963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-certificates\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " 
pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.398254 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.398220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-trusted-ca\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.398763 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.398744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-installation-pull-secrets\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.399003 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.398985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-image-registry-private-configuration\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.405710 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.405689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-bound-sa-token\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.406123 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.406106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97gc\" (UniqueName: 
\"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-kube-api-access-p97gc\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.424477 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.424454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" Apr 22 14:17:38.430837 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.430794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" Apr 22 14:17:38.546867 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.546774 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-62jld"] Apr 22 14:17:38.549552 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:17:38.549498 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9478150_727c_42e1_b8be_a5fcc142a5b7.slice/crio-acc7bd57fb3af92610a8dc7c40c5907742597c0b7c3698ca2dfec178af496e4a WatchSource:0}: Error finding container acc7bd57fb3af92610a8dc7c40c5907742597c0b7c3698ca2dfec178af496e4a: Status 404 returned error can't find the container with id acc7bd57fb3af92610a8dc7c40c5907742597c0b7c3698ca2dfec178af496e4a Apr 22 14:17:38.561424 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.561381 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4"] Apr 22 14:17:38.564619 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:17:38.564594 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc685723_3a77_43be_b573_ae8ca5c62f4a.slice/crio-3e124a48e27ebcfe9696d3558f1fbf2c5fa23c213807b5260e3a4b4ad32e773e WatchSource:0}: Error finding container 3e124a48e27ebcfe9696d3558f1fbf2c5fa23c213807b5260e3a4b4ad32e773e: Status 404 returned error can't find the container with id 3e124a48e27ebcfe9696d3558f1fbf2c5fa23c213807b5260e3a4b4ad32e773e Apr 22 14:17:38.699136 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.699104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.699339 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.699179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:38.699339 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.699271 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:38.699339 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.699307 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:39.699282731 +0000 UTC m=+134.516063075 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:38.699339 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.699334 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:39.699325911 +0000 UTC m=+134.516106259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : secret "router-metrics-certs-default" not found Apr 22 14:17:38.901184 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:38.901150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:38.901392 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.901325 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:38.901392 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.901349 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84479cf759-29dvm: secret "image-registry-tls" not found Apr 22 14:17:38.901506 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:38.901414 2576 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls podName:43d17f3f-034e-4b82-8486-ae98d79cef85 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:39.90139452 +0000 UTC m=+134.718174867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls") pod "image-registry-84479cf759-29dvm" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85") : secret "image-registry-tls" not found Apr 22 14:17:39.314234 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.314127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" event={"ID":"b9478150-727c-42e1-b8be-a5fcc142a5b7","Type":"ContainerStarted","Data":"acc7bd57fb3af92610a8dc7c40c5907742597c0b7c3698ca2dfec178af496e4a"} Apr 22 14:17:39.315899 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.315865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" event={"ID":"cc685723-3a77-43be-b573-ae8ca5c62f4a","Type":"ContainerStarted","Data":"3e124a48e27ebcfe9696d3558f1fbf2c5fa23c213807b5260e3a4b4ad32e773e"} Apr 22 14:17:39.708040 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.707999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:39.708242 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.708098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod 
\"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:39.708307 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:39.708228 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:39.708371 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:39.708308 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.708282719 +0000 UTC m=+136.525063086 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:39.708371 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:39.708345 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.708334602 +0000 UTC m=+136.525114951 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : secret "router-metrics-certs-default" not found Apr 22 14:17:39.839726 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.839690 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"] Apr 22 14:17:39.841743 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.841728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:39.844795 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.844774 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jxh77\"" Apr 22 14:17:39.844898 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.844866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 14:17:39.845893 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.845876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 14:17:39.850885 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.850865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"] Apr 22 14:17:39.910593 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:39.910554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " 
pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:39.910777 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:39.910713 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:39.910777 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:39.910734 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84479cf759-29dvm: secret "image-registry-tls" not found Apr 22 14:17:39.910921 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:39.910797 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls podName:43d17f3f-034e-4b82-8486-ae98d79cef85 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.910776943 +0000 UTC m=+136.727557287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls") pod "image-registry-84479cf759-29dvm" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85") : secret "image-registry-tls" not found Apr 22 14:17:40.012009 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:40.011919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:40.012166 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:40.012077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bed76823-3878-431a-8956-8c2da1fe873b-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:40.113517 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:40.113477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:40.113698 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:40.113564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bed76823-3878-431a-8956-8c2da1fe873b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:40.113698 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:40.113651 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:40.113812 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:40.113736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert podName:bed76823-3878-431a-8956-8c2da1fe873b nodeName:}" failed. No retries permitted until 2026-04-22 14:17:40.613713678 +0000 UTC m=+135.430494022 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-82gp2" (UID: "bed76823-3878-431a-8956-8c2da1fe873b") : secret "networking-console-plugin-cert" not found Apr 22 14:17:40.114297 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:40.114276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bed76823-3878-431a-8956-8c2da1fe873b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:40.617950 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:40.617904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:40.618397 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:40.618074 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:40.618397 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:40.618145 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert podName:bed76823-3878-431a-8956-8c2da1fe873b nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.618129624 +0000 UTC m=+136.434909967 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-82gp2" (UID: "bed76823-3878-431a-8956-8c2da1fe873b") : secret "networking-console-plugin-cert" not found Apr 22 14:17:41.321316 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.321290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/0.log" Apr 22 14:17:41.321502 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.321329 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9478150-727c-42e1-b8be-a5fcc142a5b7" containerID="e6093f71bcc1f151853d4000086dd630fe8d50af73444fd6a09a2bb4b0e391b3" exitCode=255 Apr 22 14:17:41.321502 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.321395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" event={"ID":"b9478150-727c-42e1-b8be-a5fcc142a5b7","Type":"ContainerDied","Data":"e6093f71bcc1f151853d4000086dd630fe8d50af73444fd6a09a2bb4b0e391b3"} Apr 22 14:17:41.321686 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.321644 2576 scope.go:117] "RemoveContainer" containerID="e6093f71bcc1f151853d4000086dd630fe8d50af73444fd6a09a2bb4b0e391b3" Apr 22 14:17:41.322784 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.322758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" event={"ID":"cc685723-3a77-43be-b573-ae8ca5c62f4a","Type":"ContainerStarted","Data":"d5a9bb6a311f7494b5296cb33e6808ad7a36356cb1014704f55373aa06a143cd"} Apr 22 14:17:41.359982 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.359942 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" podStartSLOduration=1.1344884020000001 podStartE2EDuration="3.359927599s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="2026-04-22 14:17:38.566285845 +0000 UTC m=+133.383066202" lastFinishedPulling="2026-04-22 14:17:40.791725056 +0000 UTC m=+135.608505399" observedRunningTime="2026-04-22 14:17:41.35892277 +0000 UTC m=+136.175703136" watchObservedRunningTime="2026-04-22 14:17:41.359927599 +0000 UTC m=+136.176708017" Apr 22 14:17:41.626833 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.626718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" Apr 22 14:17:41.627167 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.626911 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:41.627167 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.626993 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert podName:bed76823-3878-431a-8956-8c2da1fe873b nodeName:}" failed. No retries permitted until 2026-04-22 14:17:43.626972188 +0000 UTC m=+138.443752537 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-82gp2" (UID: "bed76823-3878-431a-8956-8c2da1fe873b") : secret "networking-console-plugin-cert" not found Apr 22 14:17:41.727457 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.727414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:41.727621 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.727500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x" Apr 22 14:17:41.727621 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.727561 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:41.727701 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.727626 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:45.727610436 +0000 UTC m=+140.544390779 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : secret "router-metrics-certs-default" not found Apr 22 14:17:41.727701 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.727651 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:45.727635802 +0000 UTC m=+140.544416149 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:41.930155 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:41.930063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:17:41.930312 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.930210 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:41.930312 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.930229 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84479cf759-29dvm: secret "image-registry-tls" not found Apr 22 14:17:41.930312 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:41.930281 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls podName:43d17f3f-034e-4b82-8486-ae98d79cef85 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:45.930266875 +0000 UTC m=+140.747047218 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls") pod "image-registry-84479cf759-29dvm" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85") : secret "image-registry-tls" not found Apr 22 14:17:42.326153 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.326127 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:17:42.326534 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.326515 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/0.log" Apr 22 14:17:42.326587 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.326561 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9478150-727c-42e1-b8be-a5fcc142a5b7" containerID="ca3c465444c3d46e674fe2dde1f454388f43cd4115825adf8edc35d82c51f8b7" exitCode=255 Apr 22 14:17:42.326667 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.326647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" event={"ID":"b9478150-727c-42e1-b8be-a5fcc142a5b7","Type":"ContainerDied","Data":"ca3c465444c3d46e674fe2dde1f454388f43cd4115825adf8edc35d82c51f8b7"} Apr 22 14:17:42.326706 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.326688 2576 scope.go:117] "RemoveContainer" containerID="e6093f71bcc1f151853d4000086dd630fe8d50af73444fd6a09a2bb4b0e391b3" Apr 22 14:17:42.326944 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.326924 2576 
scope.go:117] "RemoveContainer" containerID="ca3c465444c3d46e674fe2dde1f454388f43cd4115825adf8edc35d82c51f8b7" Apr 22 14:17:42.327155 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:42.327129 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-62jld_openshift-console-operator(b9478150-727c-42e1-b8be-a5fcc142a5b7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" podUID="b9478150-727c-42e1-b8be-a5fcc142a5b7" Apr 22 14:17:42.944808 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.944773 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"] Apr 22 14:17:42.947050 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.947035 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph" Apr 22 14:17:42.949940 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.949921 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 14:17:42.951104 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.951083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:42.951196 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.951089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vqkzk\"" Apr 22 14:17:42.962632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:42.962607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"] Apr 22 14:17:43.038677 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.038644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvb5\" (UniqueName: \"kubernetes.io/projected/7ed77f3f-5234-43d6-b54b-0922379dce13-kube-api-access-5dvb5\") pod \"migrator-74bb7799d9-j2xph\" (UID: \"7ed77f3f-5234-43d6-b54b-0922379dce13\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"
Apr 22 14:17:43.139438 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.139403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvb5\" (UniqueName: \"kubernetes.io/projected/7ed77f3f-5234-43d6-b54b-0922379dce13-kube-api-access-5dvb5\") pod \"migrator-74bb7799d9-j2xph\" (UID: \"7ed77f3f-5234-43d6-b54b-0922379dce13\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"
Apr 22 14:17:43.148553 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.148525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvb5\" (UniqueName: \"kubernetes.io/projected/7ed77f3f-5234-43d6-b54b-0922379dce13-kube-api-access-5dvb5\") pod \"migrator-74bb7799d9-j2xph\" (UID: \"7ed77f3f-5234-43d6-b54b-0922379dce13\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"
Apr 22 14:17:43.258196 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.258109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"
Apr 22 14:17:43.330201 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.330060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log"
Apr 22 14:17:43.330728 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.330528 2576 scope.go:117] "RemoveContainer" containerID="ca3c465444c3d46e674fe2dde1f454388f43cd4115825adf8edc35d82c51f8b7"
Apr 22 14:17:43.330962 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:43.330916 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-62jld_openshift-console-operator(b9478150-727c-42e1-b8be-a5fcc142a5b7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" podUID="b9478150-727c-42e1-b8be-a5fcc142a5b7"
Apr 22 14:17:43.373844 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.373754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph"]
Apr 22 14:17:43.377347 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:17:43.377317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed77f3f_5234_43d6_b54b_0922379dce13.slice/crio-f44e15ce4a61c701ac83d7c14df4988faa15a028a442df6941e59baf941e6e89 WatchSource:0}: Error finding container f44e15ce4a61c701ac83d7c14df4988faa15a028a442df6941e59baf941e6e89: Status 404 returned error can't find the container with id f44e15ce4a61c701ac83d7c14df4988faa15a028a442df6941e59baf941e6e89
Apr 22 14:17:43.642483 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:43.642446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"
Apr 22 14:17:43.642627 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:43.642553 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 14:17:43.642627 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:43.642605 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert podName:bed76823-3878-431a-8956-8c2da1fe873b nodeName:}" failed. No retries permitted until 2026-04-22 14:17:47.642592453 +0000 UTC m=+142.459372796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-82gp2" (UID: "bed76823-3878-431a-8956-8c2da1fe873b") : secret "networking-console-plugin-cert" not found
Apr 22 14:17:44.183521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:44.183493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lhmjg_7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba/dns-node-resolver/0.log"
Apr 22 14:17:44.334137 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:44.334096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph" event={"ID":"7ed77f3f-5234-43d6-b54b-0922379dce13","Type":"ContainerStarted","Data":"f44e15ce4a61c701ac83d7c14df4988faa15a028a442df6941e59baf941e6e89"}
Apr 22 14:17:44.797429 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:44.797352 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9rd7w_377748a7-900a-4086-b92d-5dcf4538b46f/node-ca/0.log"
Apr 22 14:17:45.338425 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:45.338390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph" event={"ID":"7ed77f3f-5234-43d6-b54b-0922379dce13","Type":"ContainerStarted","Data":"8b395533214b8ad9ead16d40e746e6b6c07b847972e5032256ca5a962a3ecf27"}
Apr 22 14:17:45.338425 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:45.338431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph" event={"ID":"7ed77f3f-5234-43d6-b54b-0922379dce13","Type":"ContainerStarted","Data":"f222dd9357b8eff5a520d21d249d74f45ce942fa36051c762ef8c56fe5184641"}
Apr 22 14:17:45.363577 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:45.363518 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-j2xph" podStartSLOduration=2.274410844 podStartE2EDuration="3.36350055s" podCreationTimestamp="2026-04-22 14:17:42 +0000 UTC" firstStartedPulling="2026-04-22 14:17:43.379360291 +0000 UTC m=+138.196140634" lastFinishedPulling="2026-04-22 14:17:44.468449993 +0000 UTC m=+139.285230340" observedRunningTime="2026-04-22 14:17:45.362652419 +0000 UTC m=+140.179432792" watchObservedRunningTime="2026-04-22 14:17:45.36350055 +0000 UTC m=+140.180280913"
Apr 22 14:17:45.760184 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:45.760092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:45.760184 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:45.760161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:45.760385 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:45.760252 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:17:45.760385 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:45.760272 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:53.76025829 +0000 UTC m=+148.577038633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : configmap references non-existent config key: service-ca.crt
Apr 22 14:17:45.760385 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:45.760304 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs podName:eae92029-ce85-4f67-9ea7-939a019950c7 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:53.760290465 +0000 UTC m=+148.577070808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs") pod "router-default-548fb84bdd-rjs5x" (UID: "eae92029-ce85-4f67-9ea7-939a019950c7") : secret "router-metrics-certs-default" not found
Apr 22 14:17:45.961922 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:45.961884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm"
Apr 22 14:17:45.962126 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:45.962052 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:17:45.962126 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:45.962076 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84479cf759-29dvm: secret "image-registry-tls" not found
Apr 22 14:17:45.962235 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:45.962149 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls podName:43d17f3f-034e-4b82-8486-ae98d79cef85 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:53.962127348 +0000 UTC m=+148.778907697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls") pod "image-registry-84479cf759-29dvm" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85") : secret "image-registry-tls" not found
Apr 22 14:17:47.678237 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:47.678192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"
Apr 22 14:17:47.678617 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:47.678338 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 14:17:47.678617 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:47.678417 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert podName:bed76823-3878-431a-8956-8c2da1fe873b nodeName:}" failed. No retries permitted until 2026-04-22 14:17:55.678397389 +0000 UTC m=+150.495177732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-82gp2" (UID: "bed76823-3878-431a-8956-8c2da1fe873b") : secret "networking-console-plugin-cert" not found
Apr 22 14:17:48.424742 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:48.424701 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-62jld"
Apr 22 14:17:48.424742 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:48.424737 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-62jld"
Apr 22 14:17:48.425129 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:48.425116 2576 scope.go:117] "RemoveContainer" containerID="ca3c465444c3d46e674fe2dde1f454388f43cd4115825adf8edc35d82c51f8b7"
Apr 22 14:17:48.425296 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:17:48.425280 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-62jld_openshift-console-operator(b9478150-727c-42e1-b8be-a5fcc142a5b7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" podUID="b9478150-727c-42e1-b8be-a5fcc142a5b7"
Apr 22 14:17:53.827490 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:53.827429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:53.827913 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:53.827528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:53.828060 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:53.828036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eae92029-ce85-4f67-9ea7-939a019950c7-service-ca-bundle\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:53.829734 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:53.829715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eae92029-ce85-4f67-9ea7-939a019950c7-metrics-certs\") pod \"router-default-548fb84bdd-rjs5x\" (UID: \"eae92029-ce85-4f67-9ea7-939a019950c7\") " pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:53.918983 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:53.918937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:54.029502 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.029465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm"
Apr 22 14:17:54.031811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.031792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"image-registry-84479cf759-29dvm\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " pod="openshift-image-registry/image-registry-84479cf759-29dvm"
Apr 22 14:17:54.038340 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.038314 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-548fb84bdd-rjs5x"]
Apr 22 14:17:54.041268 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:17:54.041237 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeae92029_ce85_4f67_9ea7_939a019950c7.slice/crio-eae22224863ade34d258c217825f07af255e5614d23bb8b8bc4edecff714a73d WatchSource:0}: Error finding container eae22224863ade34d258c217825f07af255e5614d23bb8b8bc4edecff714a73d: Status 404 returned error can't find the container with id eae22224863ade34d258c217825f07af255e5614d23bb8b8bc4edecff714a73d
Apr 22 14:17:54.129191 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.129156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84479cf759-29dvm"
Apr 22 14:17:54.247289 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.247258 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84479cf759-29dvm"]
Apr 22 14:17:54.250232 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:17:54.250194 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d17f3f_034e_4b82_8486_ae98d79cef85.slice/crio-94987be011a5e4336a0ab5585333710d2f21754b34953bb9646d0ed22ba44930 WatchSource:0}: Error finding container 94987be011a5e4336a0ab5585333710d2f21754b34953bb9646d0ed22ba44930: Status 404 returned error can't find the container with id 94987be011a5e4336a0ab5585333710d2f21754b34953bb9646d0ed22ba44930
Apr 22 14:17:54.362135 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.362044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84479cf759-29dvm" event={"ID":"43d17f3f-034e-4b82-8486-ae98d79cef85","Type":"ContainerStarted","Data":"8f38258fb5400adb59c32fabac8911302cdf760f42b6e820d5d2658ab7063c77"}
Apr 22 14:17:54.362135 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.362092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84479cf759-29dvm" event={"ID":"43d17f3f-034e-4b82-8486-ae98d79cef85","Type":"ContainerStarted","Data":"94987be011a5e4336a0ab5585333710d2f21754b34953bb9646d0ed22ba44930"}
Apr 22 14:17:54.362348 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.362132 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84479cf759-29dvm"
Apr 22 14:17:54.363503 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.363475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-548fb84bdd-rjs5x" event={"ID":"eae92029-ce85-4f67-9ea7-939a019950c7","Type":"ContainerStarted","Data":"d6f6b71b8d566f541c0d1198336b0dc7ae56b4738f72e4cbf998ac3720fdcd33"}
Apr 22 14:17:54.363503 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.363511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-548fb84bdd-rjs5x" event={"ID":"eae92029-ce85-4f67-9ea7-939a019950c7","Type":"ContainerStarted","Data":"eae22224863ade34d258c217825f07af255e5614d23bb8b8bc4edecff714a73d"}
Apr 22 14:17:54.386999 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.386949 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-84479cf759-29dvm" podStartSLOduration=16.386933368 podStartE2EDuration="16.386933368s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:54.385992331 +0000 UTC m=+149.202772699" watchObservedRunningTime="2026-04-22 14:17:54.386933368 +0000 UTC m=+149.203713733"
Apr 22 14:17:54.409907 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.409853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-548fb84bdd-rjs5x" podStartSLOduration=17.409838511 podStartE2EDuration="17.409838511s" podCreationTimestamp="2026-04-22 14:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:54.408360779 +0000 UTC m=+149.225141141" watchObservedRunningTime="2026-04-22 14:17:54.409838511 +0000 UTC m=+149.226618871"
Apr 22 14:17:54.919317 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.919277 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:54.921719 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:54.921697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:55.367418 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:55.367383 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:55.368618 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:55.368599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-548fb84bdd-rjs5x"
Apr 22 14:17:55.744202 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:55.744118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"
Apr 22 14:17:55.746451 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:55.746422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bed76823-3878-431a-8956-8c2da1fe873b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-82gp2\" (UID: \"bed76823-3878-431a-8956-8c2da1fe873b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"
Apr 22 14:17:55.751245 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:55.751222 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"
Apr 22 14:17:55.866719 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:55.866687 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-82gp2"]
Apr 22 14:17:55.869503 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:17:55.869472 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbed76823_3878_431a_8956_8c2da1fe873b.slice/crio-95ecbb74426f6d5a93ebe8a714542a98d815a8d3c87821432253401040b169a1 WatchSource:0}: Error finding container 95ecbb74426f6d5a93ebe8a714542a98d815a8d3c87821432253401040b169a1: Status 404 returned error can't find the container with id 95ecbb74426f6d5a93ebe8a714542a98d815a8d3c87821432253401040b169a1
Apr 22 14:17:56.373877 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:56.373842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" event={"ID":"bed76823-3878-431a-8956-8c2da1fe873b","Type":"ContainerStarted","Data":"95ecbb74426f6d5a93ebe8a714542a98d815a8d3c87821432253401040b169a1"}
Apr 22 14:17:57.377931 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:57.377890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" event={"ID":"bed76823-3878-431a-8956-8c2da1fe873b","Type":"ContainerStarted","Data":"ae4843866032c658b3fb2b42c5278e35139335c4b67197e2ccf1f5fb445ef9cf"}
Apr 22 14:17:57.397921 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:17:57.397881 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-82gp2" podStartSLOduration=17.430876901 podStartE2EDuration="18.397867883s" podCreationTimestamp="2026-04-22 14:17:39 +0000 UTC" firstStartedPulling="2026-04-22 14:17:55.871280908 +0000 UTC m=+150.688061256" lastFinishedPulling="2026-04-22 14:17:56.838271895 +0000 UTC m=+151.655052238" observedRunningTime="2026-04-22 14:17:57.396901145 +0000 UTC m=+152.213681509" watchObservedRunningTime="2026-04-22 14:17:57.397867883 +0000 UTC m=+152.214648250"
Apr 22 14:18:00.893221 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:00.893189 2576 scope.go:117] "RemoveContainer" containerID="ca3c465444c3d46e674fe2dde1f454388f43cd4115825adf8edc35d82c51f8b7"
Apr 22 14:18:01.132879 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:01.132841 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kcjkd" podUID="faa0ae94-53cb-46ba-af35-0e690ed5b286"
Apr 22 14:18:01.156057 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:01.155996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tr25t" podUID="d9fbf8cd-bdeb-41fe-ab55-46ff4906722e"
Apr 22 14:18:01.389534 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:01.389508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log"
Apr 22 14:18:01.389660 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:01.389600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:18:01.389660 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:01.389604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" event={"ID":"b9478150-727c-42e1-b8be-a5fcc142a5b7","Type":"ContainerStarted","Data":"2246d85ebb3ae4adc73aa2d0c5056eec60d98051384155b8b8784493bc9cf584"}
Apr 22 14:18:01.389938 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:01.389923 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-62jld"
Apr 22 14:18:01.408725 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:01.408640 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-62jld" podStartSLOduration=21.169872367 podStartE2EDuration="23.408628986s" podCreationTimestamp="2026-04-22 14:17:38 +0000 UTC" firstStartedPulling="2026-04-22 14:17:38.551481867 +0000 UTC m=+133.368262210" lastFinishedPulling="2026-04-22 14:17:40.790238476 +0000 UTC m=+135.607018829" observedRunningTime="2026-04-22 14:18:01.407878718 +0000 UTC m=+156.224659083" watchObservedRunningTime="2026-04-22 14:18:01.408628986 +0000 UTC m=+156.225409350"
Apr 22 14:18:01.668509 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:01.668428 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-62jld"
Apr 22 14:18:02.901524 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:02.901479 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dngb2" podUID="7d49b78a-27ae-4f41-a759-29b898bf6fe1"
Apr 22 14:18:06.125104 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.125024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:18:06.125104 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.125077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:18:06.127253 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.127229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9fbf8cd-bdeb-41fe-ab55-46ff4906722e-metrics-tls\") pod \"dns-default-tr25t\" (UID: \"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e\") " pod="openshift-dns/dns-default-tr25t"
Apr 22 14:18:06.127434 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.127416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa0ae94-53cb-46ba-af35-0e690ed5b286-cert\") pod \"ingress-canary-kcjkd\" (UID: \"faa0ae94-53cb-46ba-af35-0e690ed5b286\") " pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:18:06.192626 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.192593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lthk8\""
Apr 22 14:18:06.200062 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.200032 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kcjkd"
Apr 22 14:18:06.313396 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.313363 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kcjkd"]
Apr 22 14:18:06.316125 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:06.316093 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa0ae94_53cb_46ba_af35_0e690ed5b286.slice/crio-d14adf28e44b2bb430640aac27a6ad4fa896ac5e6c7a96ea8b31d0c0d584a67b WatchSource:0}: Error finding container d14adf28e44b2bb430640aac27a6ad4fa896ac5e6c7a96ea8b31d0c0d584a67b: Status 404 returned error can't find the container with id d14adf28e44b2bb430640aac27a6ad4fa896ac5e6c7a96ea8b31d0c0d584a67b
Apr 22 14:18:06.402461 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:06.402374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kcjkd" event={"ID":"faa0ae94-53cb-46ba-af35-0e690ed5b286","Type":"ContainerStarted","Data":"d14adf28e44b2bb430640aac27a6ad4fa896ac5e6c7a96ea8b31d0c0d584a67b"}
Apr 22 14:18:07.310160 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.310049 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5d5wd"]
Apr 22 14:18:07.313304 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.313280 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.317106 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.316962 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 14:18:07.317106 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.317055 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zwk4n\""
Apr 22 14:18:07.318217 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.318192 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 14:18:07.318342 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.318301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 14:18:07.318441 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.318425 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 14:18:07.327758 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.327725 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84479cf759-29dvm"]
Apr 22 14:18:07.349241 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.349193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5d5wd"]
Apr 22 14:18:07.436309 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.436268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e4c787ee-643f-4845-a682-1653914d9f62-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.436502 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.436362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c787ee-643f-4845-a682-1653914d9f62-data-volume\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.436502 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.436396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e4c787ee-643f-4845-a682-1653914d9f62-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.436502 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.436475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e4c787ee-643f-4845-a682-1653914d9f62-crio-socket\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.436653 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.436521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qks69\" (UniqueName: \"kubernetes.io/projected/e4c787ee-643f-4845-a682-1653914d9f62-kube-api-access-qks69\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537018 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.536977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c787ee-643f-4845-a682-1653914d9f62-data-volume\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537218 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e4c787ee-643f-4845-a682-1653914d9f62-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537218 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e4c787ee-643f-4845-a682-1653914d9f62-crio-socket\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537218 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qks69\" (UniqueName: \"kubernetes.io/projected/e4c787ee-643f-4845-a682-1653914d9f62-kube-api-access-qks69\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537218 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e4c787ee-643f-4845-a682-1653914d9f62-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537218 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e4c787ee-643f-4845-a682-1653914d9f62-crio-socket\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537474 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4c787ee-643f-4845-a682-1653914d9f62-data-volume\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.537702 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.537648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e4c787ee-643f-4845-a682-1653914d9f62-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.539918 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.539886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e4c787ee-643f-4845-a682-1653914d9f62-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd"
Apr 22 14:18:07.548548 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.548523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qks69\" (UniqueName:
\"kubernetes.io/projected/e4c787ee-643f-4845-a682-1653914d9f62-kube-api-access-qks69\") pod \"insights-runtime-extractor-5d5wd\" (UID: \"e4c787ee-643f-4845-a682-1653914d9f62\") " pod="openshift-insights/insights-runtime-extractor-5d5wd" Apr 22 14:18:07.625966 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.625884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5d5wd" Apr 22 14:18:07.800986 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:07.800932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5d5wd"] Apr 22 14:18:07.806042 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:07.805999 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c787ee_643f_4845_a682_1653914d9f62.slice/crio-c206e797c0af2037f49f6d00649fb462deabe6b8db41abe4d17dcbaf4e643f36 WatchSource:0}: Error finding container c206e797c0af2037f49f6d00649fb462deabe6b8db41abe4d17dcbaf4e643f36: Status 404 returned error can't find the container with id c206e797c0af2037f49f6d00649fb462deabe6b8db41abe4d17dcbaf4e643f36 Apr 22 14:18:08.409046 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:08.409017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5d5wd" event={"ID":"e4c787ee-643f-4845-a682-1653914d9f62","Type":"ContainerStarted","Data":"e7cdde5baf8ba2b00f3a7a691f2d8ef0e9de2dcdbb18d5e60d819237b870f7d8"} Apr 22 14:18:08.409349 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:08.409054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5d5wd" event={"ID":"e4c787ee-643f-4845-a682-1653914d9f62","Type":"ContainerStarted","Data":"c206e797c0af2037f49f6d00649fb462deabe6b8db41abe4d17dcbaf4e643f36"} Apr 22 14:18:08.410234 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:08.410209 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kcjkd" event={"ID":"faa0ae94-53cb-46ba-af35-0e690ed5b286","Type":"ContainerStarted","Data":"58f760d16139701c9c4810963fd34ae86e9f050b92cf7859724fc3a5e7929456"} Apr 22 14:18:08.442344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:08.442276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kcjkd" podStartSLOduration=129.051183509 podStartE2EDuration="2m10.442261531s" podCreationTimestamp="2026-04-22 14:15:58 +0000 UTC" firstStartedPulling="2026-04-22 14:18:06.317690192 +0000 UTC m=+161.134470535" lastFinishedPulling="2026-04-22 14:18:07.70876816 +0000 UTC m=+162.525548557" observedRunningTime="2026-04-22 14:18:08.441577974 +0000 UTC m=+163.258358339" watchObservedRunningTime="2026-04-22 14:18:08.442261531 +0000 UTC m=+163.259041896" Apr 22 14:18:09.414382 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:09.414349 2576 generic.go:358] "Generic (PLEG): container finished" podID="57d6baf5-437b-4343-94a8-93b21909b3b0" containerID="601a4a2727638f3f7fc8f12490001c00e7a8354a46f99884d35a69afaa7f343e" exitCode=255 Apr 22 14:18:09.414853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:09.414430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj" event={"ID":"57d6baf5-437b-4343-94a8-93b21909b3b0","Type":"ContainerDied","Data":"601a4a2727638f3f7fc8f12490001c00e7a8354a46f99884d35a69afaa7f343e"} Apr 22 14:18:09.416213 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:09.416186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5d5wd" event={"ID":"e4c787ee-643f-4845-a682-1653914d9f62","Type":"ContainerStarted","Data":"866bbe33f078dce2ecff923d56372e52b727768db0204ff76660600fa316a408"} Apr 22 14:18:09.421234 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:09.421218 2576 scope.go:117] 
"RemoveContainer" containerID="601a4a2727638f3f7fc8f12490001c00e7a8354a46f99884d35a69afaa7f343e" Apr 22 14:18:10.420487 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:10.420452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-56bcb89c67-h5skj" event={"ID":"57d6baf5-437b-4343-94a8-93b21909b3b0","Type":"ContainerStarted","Data":"e407d78554c8356c7105ea412a0f5ef6a68b5805f046913b4e065f3008871a16"} Apr 22 14:18:10.422189 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:10.422162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5d5wd" event={"ID":"e4c787ee-643f-4845-a682-1653914d9f62","Type":"ContainerStarted","Data":"51960160a0360dc9ea480819185a55afa418083c336bf05108ec2f990c3c0296"} Apr 22 14:18:10.458028 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:10.457979 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5d5wd" podStartSLOduration=1.5565995639999999 podStartE2EDuration="3.457964482s" podCreationTimestamp="2026-04-22 14:18:07 +0000 UTC" firstStartedPulling="2026-04-22 14:18:07.863224108 +0000 UTC m=+162.680004452" lastFinishedPulling="2026-04-22 14:18:09.764589016 +0000 UTC m=+164.581369370" observedRunningTime="2026-04-22 14:18:10.457300269 +0000 UTC m=+165.274080634" watchObservedRunningTime="2026-04-22 14:18:10.457964482 +0000 UTC m=+165.274744903" Apr 22 14:18:15.893790 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:15.893758 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:18:16.892178 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:16.892139 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tr25t" Apr 22 14:18:16.895113 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:16.895094 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpmwg\"" Apr 22 14:18:16.903270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:16.903249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tr25t" Apr 22 14:18:17.023312 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:17.023283 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tr25t"] Apr 22 14:18:17.025889 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:17.025854 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9fbf8cd_bdeb_41fe_ab55_46ff4906722e.slice/crio-41692cbffd609d0396a72e440962767b43d18ea4b682166f48099fb4d7d12240 WatchSource:0}: Error finding container 41692cbffd609d0396a72e440962767b43d18ea4b682166f48099fb4d7d12240: Status 404 returned error can't find the container with id 41692cbffd609d0396a72e440962767b43d18ea4b682166f48099fb4d7d12240 Apr 22 14:18:17.333321 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:17.333292 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:18:17.441572 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:17.441537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tr25t" event={"ID":"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e","Type":"ContainerStarted","Data":"41692cbffd609d0396a72e440962767b43d18ea4b682166f48099fb4d7d12240"} Apr 22 14:18:18.192461 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.192438 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d5b4f77ff-ksncj"] Apr 22 14:18:18.194274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.194254 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.196945 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.196928 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 14:18:18.198126 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.198107 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:18:18.198216 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.198160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:18:18.198328 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.198312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:18:18.198486 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.198471 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:18:18.198526 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.198514 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xv94w\"" Apr 22 14:18:18.199547 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.199531 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 14:18:18.199898 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.199882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:18:18.203282 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.203258 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 
14:18:18.206688 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.206663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d5b4f77ff-ksncj"] Apr 22 14:18:18.324004 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.323967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-oauth-config\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.324156 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.324016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-trusted-ca-bundle\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.324156 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.324047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-oauth-serving-cert\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.324156 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.324094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d8jb\" (UniqueName: \"kubernetes.io/projected/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-kube-api-access-5d8jb\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.324156 ip-10-0-139-83 kubenswrapper[2576]: I0422 
14:18:18.324126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-serving-cert\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.324156 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.324149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-service-ca\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.324408 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.324269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-config\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.424805 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.424759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d8jb\" (UniqueName: \"kubernetes.io/projected/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-kube-api-access-5d8jb\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.424965 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.424829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-serving-cert\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") 
" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.424965 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.424857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-service-ca\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.424965 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.424908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-config\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.424965 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.424953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-oauth-config\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.425186 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.424970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-trusted-ca-bundle\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.425186 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.425009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-oauth-serving-cert\") pod 
\"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.425637 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.425597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-service-ca\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.425877 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.425855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-config\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.425964 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.425895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-trusted-ca-bundle\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.426315 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.426298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-oauth-serving-cert\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.427401 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.427380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-serving-cert\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.427729 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.427711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-oauth-config\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.433022 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.433000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d8jb\" (UniqueName: \"kubernetes.io/projected/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-kube-api-access-5d8jb\") pod \"console-6d5b4f77ff-ksncj\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.445847 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.445759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tr25t" event={"ID":"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e","Type":"ContainerStarted","Data":"7485060751dc25ff6018b4e9262514336b2f135363b6d3282a339a6d1b4f095a"} Apr 22 14:18:18.445847 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.445794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tr25t" event={"ID":"d9fbf8cd-bdeb-41fe-ab55-46ff4906722e","Type":"ContainerStarted","Data":"c20f2dc95d8be8ce0b925e37b759ff3a613257ffc474810770bec2e8e960a933"} Apr 22 14:18:18.464209 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.464162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tr25t" podStartSLOduration=139.299985039 podStartE2EDuration="2m20.464147869s" podCreationTimestamp="2026-04-22 14:15:58 
+0000 UTC" firstStartedPulling="2026-04-22 14:18:17.02752508 +0000 UTC m=+171.844305424" lastFinishedPulling="2026-04-22 14:18:18.191687898 +0000 UTC m=+173.008468254" observedRunningTime="2026-04-22 14:18:18.463863994 +0000 UTC m=+173.280644359" watchObservedRunningTime="2026-04-22 14:18:18.464147869 +0000 UTC m=+173.280928233" Apr 22 14:18:18.504178 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.504140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:18.619715 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:18.619681 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d5b4f77ff-ksncj"] Apr 22 14:18:18.622356 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:18.622321 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65864d2f_1a2b_46fc_bbf0_b36601eb7c6e.slice/crio-1067617376e4e2c3fa86ea6537da7a4a1ba5d6366bd84eef1c57ad486f17fb51 WatchSource:0}: Error finding container 1067617376e4e2c3fa86ea6537da7a4a1ba5d6366bd84eef1c57ad486f17fb51: Status 404 returned error can't find the container with id 1067617376e4e2c3fa86ea6537da7a4a1ba5d6366bd84eef1c57ad486f17fb51 Apr 22 14:18:19.452221 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:19.452178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5b4f77ff-ksncj" event={"ID":"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e","Type":"ContainerStarted","Data":"1067617376e4e2c3fa86ea6537da7a4a1ba5d6366bd84eef1c57ad486f17fb51"} Apr 22 14:18:19.452695 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:19.452419 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tr25t" Apr 22 14:18:21.459294 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:21.459256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5b4f77ff-ksncj" 
event={"ID":"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e","Type":"ContainerStarted","Data":"8af1c5bffe0d85916670e6eef9c709bddbf685e43f1f7f370278733183342f99"} Apr 22 14:18:21.478356 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:21.478308 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d5b4f77ff-ksncj" podStartSLOduration=1.07967215 podStartE2EDuration="3.478295025s" podCreationTimestamp="2026-04-22 14:18:18 +0000 UTC" firstStartedPulling="2026-04-22 14:18:18.624167246 +0000 UTC m=+173.440947589" lastFinishedPulling="2026-04-22 14:18:21.022790118 +0000 UTC m=+175.839570464" observedRunningTime="2026-04-22 14:18:21.477026242 +0000 UTC m=+176.293806601" watchObservedRunningTime="2026-04-22 14:18:21.478295025 +0000 UTC m=+176.295075401" Apr 22 14:18:28.505160 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:28.505122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:28.505621 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:28.505219 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:28.509670 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:28.509647 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:29.457483 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:29.457453 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tr25t" Apr 22 14:18:29.483487 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:29.483453 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:18:30.438737 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.438702 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv"] Apr 22 14:18:30.443323 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.443296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.446205 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.446184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-zp2dl\"" Apr 22 14:18:30.446205 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.446200 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 14:18:30.446455 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.446440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 14:18:30.446592 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.446541 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 14:18:30.446592 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.446541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 14:18:30.447571 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.447556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 14:18:30.455838 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.455799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv"] Apr 22 14:18:30.528368 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.528325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.528555 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.528383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6psr\" (UniqueName: \"kubernetes.io/projected/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-kube-api-access-p6psr\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.528555 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.528474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.528555 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.528538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.560901 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.560871 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-j26mv"] Apr 22 14:18:30.565636 ip-10-0-139-83 kubenswrapper[2576]: 
I0422 14:18:30.565612 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.569111 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.569090 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 14:18:30.569371 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.569353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 14:18:30.569371 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.569365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 14:18:30.570110 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.570094 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zgcjt\"" Apr 22 14:18:30.629122 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.629094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6psr\" (UniqueName: \"kubernetes.io/projected/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-kube-api-access-p6psr\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.629231 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.629143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 
14:18:30.629287 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.629242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.629374 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.629355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.629684 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:30.629661 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 14:18:30.629795 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:30.629728 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-tls podName:9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:31.129710448 +0000 UTC m=+185.946490797 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-m2xrv" (UID: "9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0") : secret "openshift-state-metrics-tls" not found Apr 22 14:18:30.629982 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.629962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.631527 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.631510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.639483 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.639462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6psr\" (UniqueName: \"kubernetes.io/projected/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-kube-api-access-p6psr\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:30.730078 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.729996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-root\") pod \"node-exporter-j26mv\" 
(UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730078 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730261 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfclm\" (UniqueName: \"kubernetes.io/projected/475697af-7e97-4fd4-b02c-caf6094bb0b3-kube-api-access-cfclm\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730261 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-tls\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730261 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/475697af-7e97-4fd4-b02c-caf6094bb0b3-metrics-client-ca\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730261 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-sys\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730261 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-wtmp\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730427 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-textfile\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.730427 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.730292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-accelerators-collector-config\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830654 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-tls\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830853 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/475697af-7e97-4fd4-b02c-caf6094bb0b3-metrics-client-ca\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-sys\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-wtmp\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-textfile\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-accelerators-collector-config\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " 
pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.830853 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-root\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831136 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-sys\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831136 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-wtmp\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831136 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831136 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfclm\" (UniqueName: \"kubernetes.io/projected/475697af-7e97-4fd4-b02c-caf6094bb0b3-kube-api-access-cfclm\") pod \"node-exporter-j26mv\" (UID: 
\"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831136 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.830960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/475697af-7e97-4fd4-b02c-caf6094bb0b3-root\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831439 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.831196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-textfile\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831499 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.831478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/475697af-7e97-4fd4-b02c-caf6094bb0b3-metrics-client-ca\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.831554 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.831518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-accelerators-collector-config\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.833604 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.833574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-tls\") pod 
\"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.833713 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.833650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/475697af-7e97-4fd4-b02c-caf6094bb0b3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.849892 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.849871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfclm\" (UniqueName: \"kubernetes.io/projected/475697af-7e97-4fd4-b02c-caf6094bb0b3-kube-api-access-cfclm\") pod \"node-exporter-j26mv\" (UID: \"475697af-7e97-4fd4-b02c-caf6094bb0b3\") " pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.873982 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:30.873958 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-j26mv" Apr 22 14:18:30.881851 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:30.881826 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475697af_7e97_4fd4_b02c_caf6094bb0b3.slice/crio-49bf80f5564618a64b035722f2b20fb80e534a8c8da64a8ab32b36bc5f4840d4 WatchSource:0}: Error finding container 49bf80f5564618a64b035722f2b20fb80e534a8c8da64a8ab32b36bc5f4840d4: Status 404 returned error can't find the container with id 49bf80f5564618a64b035722f2b20fb80e534a8c8da64a8ab32b36bc5f4840d4 Apr 22 14:18:31.134160 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:31.134120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:31.136421 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:31.136396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-m2xrv\" (UID: \"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:31.352387 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:31.352337 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" Apr 22 14:18:31.485167 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:31.485116 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv"] Apr 22 14:18:31.487013 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:31.486967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j26mv" event={"ID":"475697af-7e97-4fd4-b02c-caf6094bb0b3","Type":"ContainerStarted","Data":"49bf80f5564618a64b035722f2b20fb80e534a8c8da64a8ab32b36bc5f4840d4"} Apr 22 14:18:31.588884 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:31.588852 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ecbca1a_cf02_4c2f_8d14_cd9996ede0d0.slice/crio-103b0349a30cc25e496bfc06a69cf63786dfab3d1417e3e7f69e7d8c3990cea3 WatchSource:0}: Error finding container 103b0349a30cc25e496bfc06a69cf63786dfab3d1417e3e7f69e7d8c3990cea3: Status 404 returned error can't find the container with id 103b0349a30cc25e496bfc06a69cf63786dfab3d1417e3e7f69e7d8c3990cea3 Apr 22 14:18:32.348047 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.348003 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-84479cf759-29dvm" podUID="43d17f3f-034e-4b82-8486-ae98d79cef85" containerName="registry" containerID="cri-o://8f38258fb5400adb59c32fabac8911302cdf760f42b6e820d5d2658ab7063c77" gracePeriod=30 Apr 22 14:18:32.491940 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.491893 2576 generic.go:358] "Generic (PLEG): container finished" podID="43d17f3f-034e-4b82-8486-ae98d79cef85" containerID="8f38258fb5400adb59c32fabac8911302cdf760f42b6e820d5d2658ab7063c77" exitCode=0 Apr 22 14:18:32.491940 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.491932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-84479cf759-29dvm" event={"ID":"43d17f3f-034e-4b82-8486-ae98d79cef85","Type":"ContainerDied","Data":"8f38258fb5400adb59c32fabac8911302cdf760f42b6e820d5d2658ab7063c77"} Apr 22 14:18:32.494718 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.494623 2576 generic.go:358] "Generic (PLEG): container finished" podID="475697af-7e97-4fd4-b02c-caf6094bb0b3" containerID="53efd6c21ff150ce2fe6732dec9eb14715142072d22dc5dd31ccd767f78a7e70" exitCode=0 Apr 22 14:18:32.494718 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.494660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j26mv" event={"ID":"475697af-7e97-4fd4-b02c-caf6094bb0b3","Type":"ContainerDied","Data":"53efd6c21ff150ce2fe6732dec9eb14715142072d22dc5dd31ccd767f78a7e70"} Apr 22 14:18:32.496659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.496630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" event={"ID":"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0","Type":"ContainerStarted","Data":"b492eefca1b4d1382e9e5a1365e85f047a717daf765e3a10bf8d4b3838f407dc"} Apr 22 14:18:32.496770 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.496668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" event={"ID":"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0","Type":"ContainerStarted","Data":"24840c408a360d0a0a69ab787c016e9d6f84439114b9735f8d190e34a77c7aa6"} Apr 22 14:18:32.496770 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.496683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" event={"ID":"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0","Type":"ContainerStarted","Data":"103b0349a30cc25e496bfc06a69cf63786dfab3d1417e3e7f69e7d8c3990cea3"} Apr 22 14:18:32.753331 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.753308 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:18:32.850672 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p97gc\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-kube-api-access-p97gc\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.850799 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850693 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-image-registry-private-configuration\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.850799 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850718 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-certificates\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.850799 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850736 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-bound-sa-token\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.850799 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850772 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43d17f3f-034e-4b82-8486-ae98d79cef85-ca-trust-extracted\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" 
(UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.850799 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850801 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-installation-pull-secrets\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.851087 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850838 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.851087 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.850874 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-trusted-ca\") pod \"43d17f3f-034e-4b82-8486-ae98d79cef85\" (UID: \"43d17f3f-034e-4b82-8486-ae98d79cef85\") " Apr 22 14:18:32.851470 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.851438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:32.851565 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.851448 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:32.853450 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.853408 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:32.853450 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.853424 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-kube-api-access-p97gc" (OuterVolumeSpecName: "kube-api-access-p97gc") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "kube-api-access-p97gc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:32.853613 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.853502 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:32.853726 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.853707 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:32.853726 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.853715 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:32.859754 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.859730 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d17f3f-034e-4b82-8486-ae98d79cef85-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "43d17f3f-034e-4b82-8486-ae98d79cef85" (UID: "43d17f3f-034e-4b82-8486-ae98d79cef85"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:18:32.951907 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951856 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p97gc\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-kube-api-access-p97gc\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:32.951907 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951901 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-image-registry-private-configuration\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:32.951907 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951912 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-certificates\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:32.951907 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951922 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-bound-sa-token\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:32.952181 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951931 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43d17f3f-034e-4b82-8486-ae98d79cef85-ca-trust-extracted\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:32.952181 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951939 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43d17f3f-034e-4b82-8486-ae98d79cef85-installation-pull-secrets\") on node \"ip-10-0-139-83.ec2.internal\" 
DevicePath \"\"" Apr 22 14:18:32.952181 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951948 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43d17f3f-034e-4b82-8486-ae98d79cef85-registry-tls\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:32.952181 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:32.951956 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43d17f3f-034e-4b82-8486-ae98d79cef85-trusted-ca\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:18:33.501640 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.501608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j26mv" event={"ID":"475697af-7e97-4fd4-b02c-caf6094bb0b3","Type":"ContainerStarted","Data":"51aa7e49c543a8bfa31e6b1fa31c147b8e48626e4ed1a19f06d005cc69a08369"} Apr 22 14:18:33.501640 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.501641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j26mv" event={"ID":"475697af-7e97-4fd4-b02c-caf6094bb0b3","Type":"ContainerStarted","Data":"de3a97f3ec7260c9a9500f9f5407ced852e0d0dafe0e816b211e51c00a161d45"} Apr 22 14:18:33.507432 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.507398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" event={"ID":"9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0","Type":"ContainerStarted","Data":"5e5898dc0b0b764e06cc1797e000035239aac4f00b988d0283541e73880fe70a"} Apr 22 14:18:33.508672 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.508646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84479cf759-29dvm" event={"ID":"43d17f3f-034e-4b82-8486-ae98d79cef85","Type":"ContainerDied","Data":"94987be011a5e4336a0ab5585333710d2f21754b34953bb9646d0ed22ba44930"} 
Apr 22 14:18:33.508794 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.508689 2576 scope.go:117] "RemoveContainer" containerID="8f38258fb5400adb59c32fabac8911302cdf760f42b6e820d5d2658ab7063c77" Apr 22 14:18:33.508794 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.508709 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84479cf759-29dvm" Apr 22 14:18:33.526006 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.525854 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-j26mv" podStartSLOduration=2.793357184 podStartE2EDuration="3.525812268s" podCreationTimestamp="2026-04-22 14:18:30 +0000 UTC" firstStartedPulling="2026-04-22 14:18:30.883403209 +0000 UTC m=+185.700183552" lastFinishedPulling="2026-04-22 14:18:31.615858284 +0000 UTC m=+186.432638636" observedRunningTime="2026-04-22 14:18:33.524643026 +0000 UTC m=+188.341423416" watchObservedRunningTime="2026-04-22 14:18:33.525812268 +0000 UTC m=+188.342592636" Apr 22 14:18:33.540415 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.540386 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84479cf759-29dvm"] Apr 22 14:18:33.544345 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.544324 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-84479cf759-29dvm"] Apr 22 14:18:33.568059 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.568003 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-m2xrv" podStartSLOduration=2.562918126 podStartE2EDuration="3.567985434s" podCreationTimestamp="2026-04-22 14:18:30 +0000 UTC" firstStartedPulling="2026-04-22 14:18:31.741355875 +0000 UTC m=+186.558136218" lastFinishedPulling="2026-04-22 14:18:32.746423182 +0000 UTC m=+187.563203526" observedRunningTime="2026-04-22 
14:18:33.567600342 +0000 UTC m=+188.384380705" watchObservedRunningTime="2026-04-22 14:18:33.567985434 +0000 UTC m=+188.384765801" Apr 22 14:18:33.896554 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:33.896523 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d17f3f-034e-4b82-8486-ae98d79cef85" path="/var/lib/kubelet/pods/43d17f3f-034e-4b82-8486-ae98d79cef85/volumes" Apr 22 14:18:35.252961 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.252923 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g"] Apr 22 14:18:35.253426 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.253288 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43d17f3f-034e-4b82-8486-ae98d79cef85" containerName="registry" Apr 22 14:18:35.253426 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.253304 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d17f3f-034e-4b82-8486-ae98d79cef85" containerName="registry" Apr 22 14:18:35.253426 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.253380 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="43d17f3f-034e-4b82-8486-ae98d79cef85" containerName="registry" Apr 22 14:18:35.255336 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.255317 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:35.258684 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.258665 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 14:18:35.258800 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.258767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5gq7r\"" Apr 22 14:18:35.267327 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.267304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g"] Apr 22 14:18:35.374970 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.374937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f63e052a-04bc-4d76-aa3d-2174723ea360-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdp6g\" (UID: \"f63e052a-04bc-4d76-aa3d-2174723ea360\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:35.476164 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.476125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f63e052a-04bc-4d76-aa3d-2174723ea360-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdp6g\" (UID: \"f63e052a-04bc-4d76-aa3d-2174723ea360\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:35.476312 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:35.476261 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 14:18:35.476391 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:18:35.476318 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f63e052a-04bc-4d76-aa3d-2174723ea360-monitoring-plugin-cert podName:f63e052a-04bc-4d76-aa3d-2174723ea360 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:35.976303293 +0000 UTC m=+190.793083645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/f63e052a-04bc-4d76-aa3d-2174723ea360-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-wdp6g" (UID: "f63e052a-04bc-4d76-aa3d-2174723ea360") : secret "monitoring-plugin-cert" not found Apr 22 14:18:35.714694 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.714658 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-96d74d89d-4pplr"] Apr 22 14:18:35.716970 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.716950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.720281 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.720253 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 14:18:35.720389 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.720283 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8h8gw\"" Apr 22 14:18:35.720389 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.720255 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 14:18:35.720389 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.720259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 14:18:35.720389 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.720255 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 14:18:35.720594 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.720580 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 14:18:35.725678 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.725652 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 14:18:35.730270 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.730245 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-96d74d89d-4pplr"] Apr 22 14:18:35.782550 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.782516 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54f4b45dfc-8v4q6"] Apr 22 14:18:35.784569 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.784552 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.802434 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.802408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54f4b45dfc-8v4q6"] Apr 22 14:18:35.879752 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlpj6\" (UniqueName: \"kubernetes.io/projected/02d8894a-49b2-46be-a938-78b844b544d1-kube-api-access-vlpj6\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.879752 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-serving-cert\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.879963 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-oauth-serving-cert\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.879963 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-oauth-config\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " 
pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.879963 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-serving-certs-ca-bundle\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.879963 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-federate-client-tls\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.880078 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.879973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.880078 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-telemeter-client-tls\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.880078 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880016 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-metrics-client-ca\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.880078 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-service-ca\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.880190 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-trusted-ca-bundle\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.880190 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-config\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.880190 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.880190 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-secret-telemeter-client\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.880300 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.880224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxsd4\" (UniqueName: \"kubernetes.io/projected/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-kube-api-access-jxsd4\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981201 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-trusted-ca-bundle\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981201 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-config\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981201 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981180 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981473 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-secret-telemeter-client\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981473 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxsd4\" (UniqueName: \"kubernetes.io/projected/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-kube-api-access-jxsd4\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981473 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlpj6\" (UniqueName: \"kubernetes.io/projected/02d8894a-49b2-46be-a938-78b844b544d1-kube-api-access-vlpj6\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981473 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-serving-cert\") pod \"console-54f4b45dfc-8v4q6\" (UID: 
\"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981656 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-oauth-serving-cert\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981656 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-oauth-config\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.981767 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-serving-certs-ca-bundle\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981767 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f63e052a-04bc-4d76-aa3d-2174723ea360-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdp6g\" (UID: \"f63e052a-04bc-4d76-aa3d-2174723ea360\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:35.981767 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" 
(UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-federate-client-tls\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-telemeter-client-tls\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-metrics-client-ca\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.981948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.981899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-service-ca\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.982131 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.982026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-config\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.982444 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.982415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-serving-certs-ca-bundle\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.982528 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.982453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-trusted-ca-bundle\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.982586 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.982523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-service-ca\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.983038 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.983014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-oauth-serving-cert\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " 
pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.983155 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.983131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-metrics-client-ca\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.984652 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.984568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-serving-cert\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.984735 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.984650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-secret-telemeter-client\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.984803 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.984745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-telemeter-client-tls\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.984803 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.984762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.984944 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.984852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d8894a-49b2-46be-a938-78b844b544d1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.984944 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.984901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/02d8894a-49b2-46be-a938-78b844b544d1-federate-client-tls\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:35.985071 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.985053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f63e052a-04bc-4d76-aa3d-2174723ea360-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdp6g\" (UID: \"f63e052a-04bc-4d76-aa3d-2174723ea360\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:35.985269 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.985253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-oauth-config\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.990120 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.990100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxsd4\" (UniqueName: \"kubernetes.io/projected/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-kube-api-access-jxsd4\") pod \"console-54f4b45dfc-8v4q6\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:35.990472 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:35.990454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlpj6\" (UniqueName: \"kubernetes.io/projected/02d8894a-49b2-46be-a938-78b844b544d1-kube-api-access-vlpj6\") pod \"telemeter-client-96d74d89d-4pplr\" (UID: \"02d8894a-49b2-46be-a938-78b844b544d1\") " pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:36.027366 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.027325 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" Apr 22 14:18:36.093316 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.093283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:18:36.164583 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.164545 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:36.174496 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.174462 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-96d74d89d-4pplr"] Apr 22 14:18:36.180753 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:36.180724 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d8894a_49b2_46be_a938_78b844b544d1.slice/crio-23dc34bfaf1f4e15e7b1699a2eceef3c699dd7687cf845106f480dc7442bae5c WatchSource:0}: Error finding container 23dc34bfaf1f4e15e7b1699a2eceef3c699dd7687cf845106f480dc7442bae5c: Status 404 returned error can't find the container with id 23dc34bfaf1f4e15e7b1699a2eceef3c699dd7687cf845106f480dc7442bae5c Apr 22 14:18:36.248780 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.248696 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54f4b45dfc-8v4q6"] Apr 22 14:18:36.252071 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:36.252037 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40db2d7a_1d9e_4234_9e61_2c6e6e14603a.slice/crio-30e83059b528c06ab22ff1cc943caa97759feafc47d93ab2ec4e4978b39c0996 WatchSource:0}: Error finding container 30e83059b528c06ab22ff1cc943caa97759feafc47d93ab2ec4e4978b39c0996: Status 404 returned error can't find the container with id 30e83059b528c06ab22ff1cc943caa97759feafc47d93ab2ec4e4978b39c0996 Apr 22 14:18:36.295491 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.295468 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g"] Apr 22 14:18:36.300939 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:36.300916 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63e052a_04bc_4d76_aa3d_2174723ea360.slice/crio-f1eb6061cd6e80d655454b6eda906ac669df9a4a01f919022b3e94a492746646 WatchSource:0}: Error finding container f1eb6061cd6e80d655454b6eda906ac669df9a4a01f919022b3e94a492746646: Status 404 returned error can't find the container with id f1eb6061cd6e80d655454b6eda906ac669df9a4a01f919022b3e94a492746646 Apr 22 14:18:36.521943 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.521846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" event={"ID":"02d8894a-49b2-46be-a938-78b844b544d1","Type":"ContainerStarted","Data":"23dc34bfaf1f4e15e7b1699a2eceef3c699dd7687cf845106f480dc7442bae5c"} Apr 22 14:18:36.523255 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.523228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54f4b45dfc-8v4q6" event={"ID":"40db2d7a-1d9e-4234-9e61-2c6e6e14603a","Type":"ContainerStarted","Data":"7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a"} Apr 22 14:18:36.523383 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.523259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54f4b45dfc-8v4q6" event={"ID":"40db2d7a-1d9e-4234-9e61-2c6e6e14603a","Type":"ContainerStarted","Data":"30e83059b528c06ab22ff1cc943caa97759feafc47d93ab2ec4e4978b39c0996"} Apr 22 14:18:36.524293 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.524272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" event={"ID":"f63e052a-04bc-4d76-aa3d-2174723ea360","Type":"ContainerStarted","Data":"f1eb6061cd6e80d655454b6eda906ac669df9a4a01f919022b3e94a492746646"} Apr 22 14:18:36.543594 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.543548 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54f4b45dfc-8v4q6" 
podStartSLOduration=1.543535757 podStartE2EDuration="1.543535757s" podCreationTimestamp="2026-04-22 14:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:36.542971703 +0000 UTC m=+191.359752071" watchObservedRunningTime="2026-04-22 14:18:36.543535757 +0000 UTC m=+191.360316121" Apr 22 14:18:36.820711 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.820677 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:18:36.823527 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.823507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.828917 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.828888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 14:18:36.829149 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.828928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 14:18:36.829149 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.828893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 14:18:36.830505 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.830208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 14:18:36.830505 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.830308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9897r\"" Apr 22 14:18:36.830505 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.830366 2576 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 14:18:36.830505 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.830426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 14:18:36.831204 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.831076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 14:18:36.831204 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.831101 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 14:18:36.831204 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.831102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 14:18:36.831578 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.831551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 14:18:36.831866 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.831844 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 14:18:36.831931 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.831881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 14:18:36.832401 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.832380 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eihsplr5au3iq\"" Apr 22 14:18:36.833894 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.833875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 14:18:36.922470 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.922443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:18:36.991499 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5daecb2-d73a-4b73-b67c-09a111c66037-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991673 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991673 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991673 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991673 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991673 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-config\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.991948 ip-10-0-139-83 kubenswrapper[2576]: I0422 
14:18:36.991936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlntl\" (UniqueName: \"kubernetes.io/projected/e5daecb2-d73a-4b73-b67c-09a111c66037-kube-api-access-jlntl\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.992311 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.991973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.992311 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.992006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.992311 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.992055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5daecb2-d73a-4b73-b67c-09a111c66037-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:36.992311 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:36.992077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5daecb2-d73a-4b73-b67c-09a111c66037-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092659 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5daecb2-d73a-4b73-b67c-09a111c66037-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.092903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.092967 ip-10-0-139-83 kubenswrapper[2576]: I0422 
14:18:37.092942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-config\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlntl\" (UniqueName: \"kubernetes.io/projected/e5daecb2-d73a-4b73-b67c-09a111c66037-kube-api-access-jlntl\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093303 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.093674 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.093530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.094506 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.094197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.096714 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.096318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.096714 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.096669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.096893 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.096720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5daecb2-d73a-4b73-b67c-09a111c66037-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.096893 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.096741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.097922 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.097359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.098032 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.097970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.098366 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.098337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.099269 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.099240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5daecb2-d73a-4b73-b67c-09a111c66037-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.100423 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.100220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.100423 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.100372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-config\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.100423 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.100404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.100607 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.100404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5daecb2-d73a-4b73-b67c-09a111c66037-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.100607 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.100490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.100607 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.100504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.102045 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.101974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5daecb2-d73a-4b73-b67c-09a111c66037-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.106983 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.106948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlntl\" (UniqueName: \"kubernetes.io/projected/e5daecb2-d73a-4b73-b67c-09a111c66037-kube-api-access-jlntl\") pod \"prometheus-k8s-0\" (UID: \"e5daecb2-d73a-4b73-b67c-09a111c66037\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.135875 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.135837 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:37.298860 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.298807 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 14:18:37.305971 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:18:37.305909 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5daecb2_d73a_4b73_b67c_09a111c66037.slice/crio-ae00e6106a5fe633289b64361f7f5d54debb342f4b4486e9a6f15d08474cc1b6 WatchSource:0}: Error finding container ae00e6106a5fe633289b64361f7f5d54debb342f4b4486e9a6f15d08474cc1b6: Status 404 returned error can't find the container with id ae00e6106a5fe633289b64361f7f5d54debb342f4b4486e9a6f15d08474cc1b6 Apr 22 14:18:37.528929 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:37.528835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"ae00e6106a5fe633289b64361f7f5d54debb342f4b4486e9a6f15d08474cc1b6"} Apr 22 14:18:38.533336 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:38.533233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" event={"ID":"02d8894a-49b2-46be-a938-78b844b544d1","Type":"ContainerStarted","Data":"544844ef812ab7dc7feb451c711593c12610548955727164da776115eecaf5bd"} Apr 22 14:18:38.534768 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:38.534736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" event={"ID":"f63e052a-04bc-4d76-aa3d-2174723ea360","Type":"ContainerStarted","Data":"508c4c84d523add6d133097cc465ab708f422fbfa871c6829792080142b7a316"} Apr 22 14:18:38.534967 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:38.534946 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:38.540618 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:38.540594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" Apr 22 14:18:38.551631 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:38.551586 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdp6g" podStartSLOduration=1.765217778 podStartE2EDuration="3.551570835s" podCreationTimestamp="2026-04-22 14:18:35 +0000 UTC" firstStartedPulling="2026-04-22 14:18:36.302770527 +0000 UTC m=+191.119550870" lastFinishedPulling="2026-04-22 14:18:38.089123581 +0000 UTC m=+192.905903927" observedRunningTime="2026-04-22 14:18:38.55101126 +0000 UTC m=+193.367791625" watchObservedRunningTime="2026-04-22 14:18:38.551570835 +0000 UTC m=+193.368351200" Apr 22 14:18:39.538334 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:39.538297 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5daecb2-d73a-4b73-b67c-09a111c66037" containerID="ab6e4d65fddda0c3fce344c652ed3cf6e90da4e28ebbebf986c731bfef479c41" exitCode=0 Apr 22 14:18:39.538851 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:39.538387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerDied","Data":"ab6e4d65fddda0c3fce344c652ed3cf6e90da4e28ebbebf986c731bfef479c41"} Apr 22 14:18:39.540261 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:39.540232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" event={"ID":"02d8894a-49b2-46be-a938-78b844b544d1","Type":"ContainerStarted","Data":"0fdbac4545e7a52500da694a279b1992c8b9001251cd52a96a2abd475aef6768"} Apr 22 14:18:39.540322 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:39.540270 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" event={"ID":"02d8894a-49b2-46be-a938-78b844b544d1","Type":"ContainerStarted","Data":"0f282f544ae4b5f60a4c36d2e2a52fdaedc154ce58d8d1414f6baa774b33bc21"} Apr 22 14:18:39.587497 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:39.587447 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-96d74d89d-4pplr" podStartSLOduration=2.017025452 podStartE2EDuration="4.587429386s" podCreationTimestamp="2026-04-22 14:18:35 +0000 UTC" firstStartedPulling="2026-04-22 14:18:36.183577337 +0000 UTC m=+191.000357693" lastFinishedPulling="2026-04-22 14:18:38.753981266 +0000 UTC m=+193.570761627" observedRunningTime="2026-04-22 14:18:39.586224973 +0000 UTC m=+194.403005336" watchObservedRunningTime="2026-04-22 14:18:39.587429386 +0000 UTC m=+194.404209749" Apr 22 14:18:40.305054 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:40.305017 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54f4b45dfc-8v4q6"] Apr 22 14:18:43.554182 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:43.554147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"2ea3ea2483da3e492413dc5bf90cd16973bd66f840073fff81d25fe98c62434f"} Apr 22 14:18:43.554182 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:43.554187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"445eeb3dd404d159d2ad71b698e7efec254aa66969ce664fa2acdd75dac8b3c9"} Apr 22 14:18:44.563735 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:44.563701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"8b07c4a6cfe17cf163819893e83bb5e36d2f12aedcedbe10c41da90188a656fa"} Apr 22 14:18:45.568986 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:45.568951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"c7edb92fe5de0a880af36bbf0e996f750ac3309cebfe4349237027b000bacf8a"} Apr 22 14:18:45.568986 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:45.568986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"61244558bf86813fef922fcf9e33a6712186b5edfcb87fc824a463abc4d3bfce"} Apr 22 14:18:45.568986 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:45.568995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5daecb2-d73a-4b73-b67c-09a111c66037","Type":"ContainerStarted","Data":"231b550488c07dfd0fb337ed4d2263329f3e3098a909717562a447e3cddb2ab4"} Apr 22 14:18:45.601314 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:45.601250 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.432006102 podStartE2EDuration="9.60123022s" podCreationTimestamp="2026-04-22 14:18:36 +0000 UTC" firstStartedPulling="2026-04-22 14:18:37.308555588 +0000 UTC m=+192.125335935" lastFinishedPulling="2026-04-22 14:18:44.477779696 +0000 UTC m=+199.294560053" observedRunningTime="2026-04-22 14:18:45.598580571 +0000 UTC m=+200.415360936" watchObservedRunningTime="2026-04-22 14:18:45.60123022 +0000 UTC m=+200.418010585" Apr 22 14:18:46.093923 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:46.093878 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 
14:18:47.136744 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:47.136707 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:18:51.516745 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:18:51.516712 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d5b4f77ff-ksncj"] Apr 22 14:19:05.329688 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.329624 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54f4b45dfc-8v4q6" podUID="40db2d7a-1d9e-4234-9e61-2c6e6e14603a" containerName="console" containerID="cri-o://7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a" gracePeriod=15 Apr 22 14:19:05.574939 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.574910 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54f4b45dfc-8v4q6_40db2d7a-1d9e-4234-9e61-2c6e6e14603a/console/0.log" Apr 22 14:19:05.575066 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.574983 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:19:05.625435 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.625357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54f4b45dfc-8v4q6_40db2d7a-1d9e-4234-9e61-2c6e6e14603a/console/0.log" Apr 22 14:19:05.625435 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.625402 2576 generic.go:358] "Generic (PLEG): container finished" podID="40db2d7a-1d9e-4234-9e61-2c6e6e14603a" containerID="7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a" exitCode=2 Apr 22 14:19:05.625612 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.625465 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54f4b45dfc-8v4q6" Apr 22 14:19:05.625612 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.625485 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54f4b45dfc-8v4q6" event={"ID":"40db2d7a-1d9e-4234-9e61-2c6e6e14603a","Type":"ContainerDied","Data":"7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a"} Apr 22 14:19:05.625612 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.625528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54f4b45dfc-8v4q6" event={"ID":"40db2d7a-1d9e-4234-9e61-2c6e6e14603a","Type":"ContainerDied","Data":"30e83059b528c06ab22ff1cc943caa97759feafc47d93ab2ec4e4978b39c0996"} Apr 22 14:19:05.625612 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.625549 2576 scope.go:117] "RemoveContainer" containerID="7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a" Apr 22 14:19:05.632854 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.632837 2576 scope.go:117] "RemoveContainer" containerID="7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a" Apr 22 14:19:05.633095 ip-10-0-139-83 kubenswrapper[2576]: E0422 14:19:05.633076 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a\": container with ID starting with 7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a not found: ID does not exist" containerID="7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a" Apr 22 14:19:05.633138 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.633105 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a"} err="failed to get container status \"7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a\": rpc error: code = 
NotFound desc = could not find container \"7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a\": container with ID starting with 7f65c07ba3730bf97e3ec69a26a447a68ee4e567e73ce1ba699c950a6980960a not found: ID does not exist" Apr 22 14:19:05.665682 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665659 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-trusted-ca-bundle\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.665772 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665697 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-serving-cert\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.665772 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-config\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.665873 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665810 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxsd4\" (UniqueName: \"kubernetes.io/projected/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-kube-api-access-jxsd4\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.665910 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665871 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-service-ca\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.665947 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665901 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-oauth-serving-cert\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.665947 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.665943 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-oauth-config\") pod \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\" (UID: \"40db2d7a-1d9e-4234-9e61-2c6e6e14603a\") " Apr 22 14:19:05.666135 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.666102 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-config" (OuterVolumeSpecName: "console-config") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:05.666229 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.666122 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:05.666361 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.666283 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:05.666361 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.666321 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-service-ca" (OuterVolumeSpecName: "service-ca") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:05.667874 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.667850 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:05.667953 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.667890 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-kube-api-access-jxsd4" (OuterVolumeSpecName: "kube-api-access-jxsd4") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "kube-api-access-jxsd4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:05.667998 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.667952 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "40db2d7a-1d9e-4234-9e61-2c6e6e14603a" (UID: "40db2d7a-1d9e-4234-9e61-2c6e6e14603a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:05.766946 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766899 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-service-ca\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.766946 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766940 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-oauth-serving-cert\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.766946 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766953 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-oauth-config\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.766946 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766964 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-trusted-ca-bundle\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.767249 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766974 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-serving-cert\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.767249 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766983 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-console-config\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.767249 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.766991 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxsd4\" (UniqueName: \"kubernetes.io/projected/40db2d7a-1d9e-4234-9e61-2c6e6e14603a-kube-api-access-jxsd4\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:05.942456 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.942422 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54f4b45dfc-8v4q6"] Apr 22 14:19:05.947142 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:05.947116 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54f4b45dfc-8v4q6"] Apr 22 14:19:07.896048 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:07.896015 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40db2d7a-1d9e-4234-9e61-2c6e6e14603a" path="/var/lib/kubelet/pods/40db2d7a-1d9e-4234-9e61-2c6e6e14603a/volumes" Apr 22 14:19:12.645624 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:12.645588 2576 generic.go:358] "Generic (PLEG): container finished" podID="cc685723-3a77-43be-b573-ae8ca5c62f4a" containerID="d5a9bb6a311f7494b5296cb33e6808ad7a36356cb1014704f55373aa06a143cd" exitCode=0 Apr 22 14:19:12.646022 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:12.645635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" 
event={"ID":"cc685723-3a77-43be-b573-ae8ca5c62f4a","Type":"ContainerDied","Data":"d5a9bb6a311f7494b5296cb33e6808ad7a36356cb1014704f55373aa06a143cd"} Apr 22 14:19:12.646022 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:12.645991 2576 scope.go:117] "RemoveContainer" containerID="d5a9bb6a311f7494b5296cb33e6808ad7a36356cb1014704f55373aa06a143cd" Apr 22 14:19:13.652569 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:13.652535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6rwp4" event={"ID":"cc685723-3a77-43be-b573-ae8ca5c62f4a","Type":"ContainerStarted","Data":"52ff67288bb0e45b6f6b24fde0ebd9ee1fa9817de272662834d5f56fa949dce9"} Apr 22 14:19:16.537385 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.537343 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d5b4f77ff-ksncj" podUID="65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" containerName="console" containerID="cri-o://8af1c5bffe0d85916670e6eef9c709bddbf685e43f1f7f370278733183342f99" gracePeriod=15 Apr 22 14:19:16.664354 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.664153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d5b4f77ff-ksncj_65864d2f-1a2b-46fc-bbf0-b36601eb7c6e/console/0.log" Apr 22 14:19:16.664354 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.664206 2576 generic.go:358] "Generic (PLEG): container finished" podID="65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" containerID="8af1c5bffe0d85916670e6eef9c709bddbf685e43f1f7f370278733183342f99" exitCode=2 Apr 22 14:19:16.664354 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.664293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5b4f77ff-ksncj" event={"ID":"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e","Type":"ContainerDied","Data":"8af1c5bffe0d85916670e6eef9c709bddbf685e43f1f7f370278733183342f99"} Apr 22 14:19:16.853318 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.853291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d5b4f77ff-ksncj_65864d2f-1a2b-46fc-bbf0-b36601eb7c6e/console/0.log" Apr 22 14:19:16.853466 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.853351 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:19:16.976539 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.976489 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-trusted-ca-bundle\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.976740 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.976576 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-serving-cert\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.976740 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.976628 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-oauth-config\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.976740 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.976673 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d8jb\" (UniqueName: \"kubernetes.io/projected/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-kube-api-access-5d8jb\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.976740 ip-10-0-139-83 
kubenswrapper[2576]: I0422 14:19:16.976714 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-service-ca\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.976970 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.976749 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-oauth-serving-cert\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.976970 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.976777 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-config\") pod \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\" (UID: \"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e\") " Apr 22 14:19:16.977425 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.977393 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-config" (OuterVolumeSpecName: "console-config") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:16.977742 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.977718 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:16.978458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.978408 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-service-ca" (OuterVolumeSpecName: "service-ca") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:16.978570 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.978546 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:19:16.983016 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.982966 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:16.987082 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.986967 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:19:16.987082 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:16.987046 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-kube-api-access-5d8jb" (OuterVolumeSpecName: "kube-api-access-5d8jb") pod "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" (UID: "65864d2f-1a2b-46fc-bbf0-b36601eb7c6e"). InnerVolumeSpecName "kube-api-access-5d8jb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:19:17.077632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077584 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-service-ca\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.077632 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077627 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-oauth-serving-cert\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.077915 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077645 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-config\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.077915 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077659 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-trusted-ca-bundle\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.077915 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077677 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-serving-cert\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.077915 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077692 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-console-oauth-config\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.077915 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.077705 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5d8jb\" (UniqueName: \"kubernetes.io/projected/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e-kube-api-access-5d8jb\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:19:17.668790 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.668760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d5b4f77ff-ksncj_65864d2f-1a2b-46fc-bbf0-b36601eb7c6e/console/0.log" Apr 22 14:19:17.669361 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.668847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5b4f77ff-ksncj" event={"ID":"65864d2f-1a2b-46fc-bbf0-b36601eb7c6e","Type":"ContainerDied","Data":"1067617376e4e2c3fa86ea6537da7a4a1ba5d6366bd84eef1c57ad486f17fb51"} Apr 22 14:19:17.669361 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.668893 2576 scope.go:117] "RemoveContainer" containerID="8af1c5bffe0d85916670e6eef9c709bddbf685e43f1f7f370278733183342f99" Apr 22 14:19:17.669361 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.668913 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d5b4f77ff-ksncj" Apr 22 14:19:17.691755 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.691722 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d5b4f77ff-ksncj"] Apr 22 14:19:17.695318 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.695285 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d5b4f77ff-ksncj"] Apr 22 14:19:17.897088 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:17.897038 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" path="/var/lib/kubelet/pods/65864d2f-1a2b-46fc-bbf0-b36601eb7c6e/volumes" Apr 22 14:19:36.656309 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:36.656206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:19:36.658471 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:36.658444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d49b78a-27ae-4f41-a759-29b898bf6fe1-metrics-certs\") pod \"network-metrics-daemon-dngb2\" (UID: \"7d49b78a-27ae-4f41-a759-29b898bf6fe1\") " pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:19:36.897190 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:36.897158 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mfkfw\"" Apr 22 14:19:36.904091 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:36.904064 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dngb2" Apr 22 14:19:37.026430 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:37.026394 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dngb2"] Apr 22 14:19:37.029341 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:19:37.029312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d49b78a_27ae_4f41_a759_29b898bf6fe1.slice/crio-5ea2de31edb0c449bfe16b38db613546d9503912cf8debdd159731721787ad1b WatchSource:0}: Error finding container 5ea2de31edb0c449bfe16b38db613546d9503912cf8debdd159731721787ad1b: Status 404 returned error can't find the container with id 5ea2de31edb0c449bfe16b38db613546d9503912cf8debdd159731721787ad1b Apr 22 14:19:37.137080 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:37.137046 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:37.152334 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:37.152308 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:37.727307 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:37.727265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dngb2" event={"ID":"7d49b78a-27ae-4f41-a759-29b898bf6fe1","Type":"ContainerStarted","Data":"5ea2de31edb0c449bfe16b38db613546d9503912cf8debdd159731721787ad1b"} Apr 22 14:19:37.743836 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:37.743774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 14:19:38.732068 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:38.732029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dngb2" 
event={"ID":"7d49b78a-27ae-4f41-a759-29b898bf6fe1","Type":"ContainerStarted","Data":"414f1306c30d6651f1aa19e920f44a3927eaeb69991879d1d5f2ceb3f1560fbd"} Apr 22 14:19:38.732068 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:38.732073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dngb2" event={"ID":"7d49b78a-27ae-4f41-a759-29b898bf6fe1","Type":"ContainerStarted","Data":"819c88c19ad3b085f163edfb2662d08fc2f64d6842d5d0540ef8838631b295f0"} Apr 22 14:19:38.748811 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:19:38.748758 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dngb2" podStartSLOduration=252.795818376 podStartE2EDuration="4m13.748740322s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:19:37.03121281 +0000 UTC m=+251.847993154" lastFinishedPulling="2026-04-22 14:19:37.984134757 +0000 UTC m=+252.800915100" observedRunningTime="2026-04-22 14:19:38.747566642 +0000 UTC m=+253.564347009" watchObservedRunningTime="2026-04-22 14:19:38.748740322 +0000 UTC m=+253.565520732" Apr 22 14:20:25.743771 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:20:25.743738 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:20:25.743771 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:20:25.743764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:20:25.758533 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:20:25.758505 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:24:39.602550 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602517 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-wxnt6"] Apr 22 14:24:39.603103 
ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602855 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" containerName="console" Apr 22 14:24:39.603103 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602870 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" containerName="console" Apr 22 14:24:39.603103 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602880 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40db2d7a-1d9e-4234-9e61-2c6e6e14603a" containerName="console" Apr 22 14:24:39.603103 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602886 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="40db2d7a-1d9e-4234-9e61-2c6e6e14603a" containerName="console" Apr 22 14:24:39.603103 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602956 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="40db2d7a-1d9e-4234-9e61-2c6e6e14603a" containerName="console" Apr 22 14:24:39.603103 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.602965 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="65864d2f-1a2b-46fc-bbf0-b36601eb7c6e" containerName="console" Apr 22 14:24:39.605528 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.605512 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wxnt6" Apr 22 14:24:39.608580 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.608541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-m75pq\"" Apr 22 14:24:39.608674 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.608606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 14:24:39.608731 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.608717 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:24:39.609789 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.609768 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:24:39.614557 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.614535 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wxnt6"] Apr 22 14:24:39.677020 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.676978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdchl\" (UniqueName: \"kubernetes.io/projected/8dfd969a-9d00-4e71-981c-8000c07daf6a-kube-api-access-zdchl\") pod \"s3-init-wxnt6\" (UID: \"8dfd969a-9d00-4e71-981c-8000c07daf6a\") " pod="kserve/s3-init-wxnt6" Apr 22 14:24:39.777458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.777407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdchl\" (UniqueName: \"kubernetes.io/projected/8dfd969a-9d00-4e71-981c-8000c07daf6a-kube-api-access-zdchl\") pod \"s3-init-wxnt6\" (UID: \"8dfd969a-9d00-4e71-981c-8000c07daf6a\") " pod="kserve/s3-init-wxnt6" Apr 22 14:24:39.786702 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.786678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdchl\" (UniqueName: 
\"kubernetes.io/projected/8dfd969a-9d00-4e71-981c-8000c07daf6a-kube-api-access-zdchl\") pod \"s3-init-wxnt6\" (UID: \"8dfd969a-9d00-4e71-981c-8000c07daf6a\") " pod="kserve/s3-init-wxnt6" Apr 22 14:24:39.915084 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:39.914983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-wxnt6" Apr 22 14:24:40.034344 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:40.034309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-wxnt6"] Apr 22 14:24:40.037506 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:24:40.037480 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfd969a_9d00_4e71_981c_8000c07daf6a.slice/crio-63e09616b5accd35926ba2124747c2038e89f58f788ab749da6fc1326b9c909b WatchSource:0}: Error finding container 63e09616b5accd35926ba2124747c2038e89f58f788ab749da6fc1326b9c909b: Status 404 returned error can't find the container with id 63e09616b5accd35926ba2124747c2038e89f58f788ab749da6fc1326b9c909b Apr 22 14:24:40.039584 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:40.039568 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:24:40.576984 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:40.576928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wxnt6" event={"ID":"8dfd969a-9d00-4e71-981c-8000c07daf6a","Type":"ContainerStarted","Data":"63e09616b5accd35926ba2124747c2038e89f58f788ab749da6fc1326b9c909b"} Apr 22 14:24:44.595419 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:44.595375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wxnt6" event={"ID":"8dfd969a-9d00-4e71-981c-8000c07daf6a","Type":"ContainerStarted","Data":"d8d0f84a1296e508de8e3ad461030ed502bb08619e68efc696581e3b0f7672e5"} Apr 22 14:24:44.619225 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:44.619157 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-wxnt6" podStartSLOduration=1.284701882 podStartE2EDuration="5.619139173s" podCreationTimestamp="2026-04-22 14:24:39 +0000 UTC" firstStartedPulling="2026-04-22 14:24:40.039750255 +0000 UTC m=+554.856530609" lastFinishedPulling="2026-04-22 14:24:44.374187554 +0000 UTC m=+559.190967900" observedRunningTime="2026-04-22 14:24:44.618008805 +0000 UTC m=+559.434789170" watchObservedRunningTime="2026-04-22 14:24:44.619139173 +0000 UTC m=+559.435919535" Apr 22 14:24:47.605481 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:47.605396 2576 generic.go:358] "Generic (PLEG): container finished" podID="8dfd969a-9d00-4e71-981c-8000c07daf6a" containerID="d8d0f84a1296e508de8e3ad461030ed502bb08619e68efc696581e3b0f7672e5" exitCode=0 Apr 22 14:24:47.605481 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:47.605434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wxnt6" event={"ID":"8dfd969a-9d00-4e71-981c-8000c07daf6a","Type":"ContainerDied","Data":"d8d0f84a1296e508de8e3ad461030ed502bb08619e68efc696581e3b0f7672e5"} Apr 22 14:24:48.733087 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:48.733058 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wxnt6" Apr 22 14:24:48.860513 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:48.860425 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdchl\" (UniqueName: \"kubernetes.io/projected/8dfd969a-9d00-4e71-981c-8000c07daf6a-kube-api-access-zdchl\") pod \"8dfd969a-9d00-4e71-981c-8000c07daf6a\" (UID: \"8dfd969a-9d00-4e71-981c-8000c07daf6a\") " Apr 22 14:24:48.862493 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:48.862463 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfd969a-9d00-4e71-981c-8000c07daf6a-kube-api-access-zdchl" (OuterVolumeSpecName: "kube-api-access-zdchl") pod "8dfd969a-9d00-4e71-981c-8000c07daf6a" (UID: "8dfd969a-9d00-4e71-981c-8000c07daf6a"). InnerVolumeSpecName "kube-api-access-zdchl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:24:48.961379 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:48.961328 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdchl\" (UniqueName: \"kubernetes.io/projected/8dfd969a-9d00-4e71-981c-8000c07daf6a-kube-api-access-zdchl\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:24:49.612584 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:49.612547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-wxnt6" event={"ID":"8dfd969a-9d00-4e71-981c-8000c07daf6a","Type":"ContainerDied","Data":"63e09616b5accd35926ba2124747c2038e89f58f788ab749da6fc1326b9c909b"} Apr 22 14:24:49.612584 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:49.612565 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-wxnt6" Apr 22 14:24:49.612584 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:24:49.612580 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63e09616b5accd35926ba2124747c2038e89f58f788ab749da6fc1326b9c909b" Apr 22 14:25:00.122081 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.122042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-sr8c2"] Apr 22 14:25:00.122458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.122340 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dfd969a-9d00-4e71-981c-8000c07daf6a" containerName="s3-init" Apr 22 14:25:00.122458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.122377 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfd969a-9d00-4e71-981c-8000c07daf6a" containerName="s3-init" Apr 22 14:25:00.122458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.122426 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dfd969a-9d00-4e71-981c-8000c07daf6a" containerName="s3-init" Apr 22 14:25:00.125501 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.125484 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:00.130942 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.130909 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:25:00.131077 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.130988 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 14:25:00.132058 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.132038 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-m75pq\"" Apr 22 14:25:00.132155 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.132041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:25:00.134841 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.134803 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-sr8c2"] Apr 22 14:25:00.248013 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.247969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksdg6\" (UniqueName: \"kubernetes.io/projected/b422fb0f-a80b-4149-bd14-d82c76d2a785-kube-api-access-ksdg6\") pod \"s3-tls-init-custom-sr8c2\" (UID: \"b422fb0f-a80b-4149-bd14-d82c76d2a785\") " pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:00.348808 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.348775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksdg6\" (UniqueName: \"kubernetes.io/projected/b422fb0f-a80b-4149-bd14-d82c76d2a785-kube-api-access-ksdg6\") pod \"s3-tls-init-custom-sr8c2\" (UID: \"b422fb0f-a80b-4149-bd14-d82c76d2a785\") " pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:00.359017 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.358979 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ksdg6\" (UniqueName: \"kubernetes.io/projected/b422fb0f-a80b-4149-bd14-d82c76d2a785-kube-api-access-ksdg6\") pod \"s3-tls-init-custom-sr8c2\" (UID: \"b422fb0f-a80b-4149-bd14-d82c76d2a785\") " pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:00.434580 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.434478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:00.553088 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.553051 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-sr8c2"] Apr 22 14:25:00.557036 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:25:00.557008 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb422fb0f_a80b_4149_bd14_d82c76d2a785.slice/crio-ecd5d40a8176bffa19f2c46345771a5117fe8afecf3a1bff677f080058a45709 WatchSource:0}: Error finding container ecd5d40a8176bffa19f2c46345771a5117fe8afecf3a1bff677f080058a45709: Status 404 returned error can't find the container with id ecd5d40a8176bffa19f2c46345771a5117fe8afecf3a1bff677f080058a45709 Apr 22 14:25:00.644493 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.644451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-sr8c2" event={"ID":"b422fb0f-a80b-4149-bd14-d82c76d2a785","Type":"ContainerStarted","Data":"26007a7783574fa6909940b4a23f0e3de7e8d31ed9648d301fbb60235baadba0"} Apr 22 14:25:00.644493 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.644496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-sr8c2" event={"ID":"b422fb0f-a80b-4149-bd14-d82c76d2a785","Type":"ContainerStarted","Data":"ecd5d40a8176bffa19f2c46345771a5117fe8afecf3a1bff677f080058a45709"} Apr 22 14:25:00.661419 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:00.661361 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-custom-sr8c2" podStartSLOduration=0.661342734 podStartE2EDuration="661.342734ms" podCreationTimestamp="2026-04-22 14:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:25:00.660916354 +0000 UTC m=+575.477696720" watchObservedRunningTime="2026-04-22 14:25:00.661342734 +0000 UTC m=+575.478123104" Apr 22 14:25:05.663505 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:05.663470 2576 generic.go:358] "Generic (PLEG): container finished" podID="b422fb0f-a80b-4149-bd14-d82c76d2a785" containerID="26007a7783574fa6909940b4a23f0e3de7e8d31ed9648d301fbb60235baadba0" exitCode=0 Apr 22 14:25:05.663875 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:05.663541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-sr8c2" event={"ID":"b422fb0f-a80b-4149-bd14-d82c76d2a785","Type":"ContainerDied","Data":"26007a7783574fa6909940b4a23f0e3de7e8d31ed9648d301fbb60235baadba0"} Apr 22 14:25:06.796346 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:06.796321 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:06.907527 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:06.907499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksdg6\" (UniqueName: \"kubernetes.io/projected/b422fb0f-a80b-4149-bd14-d82c76d2a785-kube-api-access-ksdg6\") pod \"b422fb0f-a80b-4149-bd14-d82c76d2a785\" (UID: \"b422fb0f-a80b-4149-bd14-d82c76d2a785\") " Apr 22 14:25:06.909521 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:06.909491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b422fb0f-a80b-4149-bd14-d82c76d2a785-kube-api-access-ksdg6" (OuterVolumeSpecName: "kube-api-access-ksdg6") pod "b422fb0f-a80b-4149-bd14-d82c76d2a785" (UID: "b422fb0f-a80b-4149-bd14-d82c76d2a785"). InnerVolumeSpecName "kube-api-access-ksdg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:25:07.008493 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:07.008402 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ksdg6\" (UniqueName: \"kubernetes.io/projected/b422fb0f-a80b-4149-bd14-d82c76d2a785-kube-api-access-ksdg6\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:25:07.670458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:07.670426 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-sr8c2" Apr 22 14:25:07.670458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:07.670431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-sr8c2" event={"ID":"b422fb0f-a80b-4149-bd14-d82c76d2a785","Type":"ContainerDied","Data":"ecd5d40a8176bffa19f2c46345771a5117fe8afecf3a1bff677f080058a45709"} Apr 22 14:25:07.670458 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:07.670465 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecd5d40a8176bffa19f2c46345771a5117fe8afecf3a1bff677f080058a45709" Apr 22 14:25:10.402301 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.402268 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-hwb4t"] Apr 22 14:25:10.402792 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.402570 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b422fb0f-a80b-4149-bd14-d82c76d2a785" containerName="s3-tls-init-custom" Apr 22 14:25:10.402792 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.402581 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b422fb0f-a80b-4149-bd14-d82c76d2a785" containerName="s3-tls-init-custom" Apr 22 14:25:10.402792 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.402628 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b422fb0f-a80b-4149-bd14-d82c76d2a785" containerName="s3-tls-init-custom" Apr 22 14:25:10.404932 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.404914 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:10.407734 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.407708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 22 14:25:10.407859 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.407743 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:25:10.408751 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.408731 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-m75pq\"" Apr 22 14:25:10.408879 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.408732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:25:10.413274 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.413250 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-hwb4t"] Apr 22 14:25:10.438583 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.438553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwffd\" (UniqueName: \"kubernetes.io/projected/e57e4ed2-4369-49b9-a63a-b7e6f890d207-kube-api-access-vwffd\") pod \"s3-tls-init-serving-hwb4t\" (UID: \"e57e4ed2-4369-49b9-a63a-b7e6f890d207\") " pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:10.539369 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.539335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwffd\" (UniqueName: \"kubernetes.io/projected/e57e4ed2-4369-49b9-a63a-b7e6f890d207-kube-api-access-vwffd\") pod \"s3-tls-init-serving-hwb4t\" (UID: \"e57e4ed2-4369-49b9-a63a-b7e6f890d207\") " pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:10.550878 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.550858 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vwffd\" (UniqueName: \"kubernetes.io/projected/e57e4ed2-4369-49b9-a63a-b7e6f890d207-kube-api-access-vwffd\") pod \"s3-tls-init-serving-hwb4t\" (UID: \"e57e4ed2-4369-49b9-a63a-b7e6f890d207\") " pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:10.714008 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.713920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:10.834498 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:10.834461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-hwb4t"] Apr 22 14:25:10.837369 ip-10-0-139-83 kubenswrapper[2576]: W0422 14:25:10.837341 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode57e4ed2_4369_49b9_a63a_b7e6f890d207.slice/crio-67ccacd3841f73a26994be1760c2f8cfd58392da645e94f96e17c327e42b08c5 WatchSource:0}: Error finding container 67ccacd3841f73a26994be1760c2f8cfd58392da645e94f96e17c327e42b08c5: Status 404 returned error can't find the container with id 67ccacd3841f73a26994be1760c2f8cfd58392da645e94f96e17c327e42b08c5 Apr 22 14:25:11.682630 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:11.682594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-hwb4t" event={"ID":"e57e4ed2-4369-49b9-a63a-b7e6f890d207","Type":"ContainerStarted","Data":"5c4dab6ab85a3b48157a11bbc253d1bf6907658c35fead9ba940e444aae9eb47"} Apr 22 14:25:11.682630 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:11.682630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-hwb4t" event={"ID":"e57e4ed2-4369-49b9-a63a-b7e6f890d207","Type":"ContainerStarted","Data":"67ccacd3841f73a26994be1760c2f8cfd58392da645e94f96e17c327e42b08c5"} Apr 22 14:25:11.701192 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:11.701141 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-serving-hwb4t" podStartSLOduration=1.701123044 podStartE2EDuration="1.701123044s" podCreationTimestamp="2026-04-22 14:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:25:11.700353406 +0000 UTC m=+586.517133767" watchObservedRunningTime="2026-04-22 14:25:11.701123044 +0000 UTC m=+586.517903407" Apr 22 14:25:15.696775 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:15.696688 2576 generic.go:358] "Generic (PLEG): container finished" podID="e57e4ed2-4369-49b9-a63a-b7e6f890d207" containerID="5c4dab6ab85a3b48157a11bbc253d1bf6907658c35fead9ba940e444aae9eb47" exitCode=0 Apr 22 14:25:15.696775 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:15.696760 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-hwb4t" event={"ID":"e57e4ed2-4369-49b9-a63a-b7e6f890d207","Type":"ContainerDied","Data":"5c4dab6ab85a3b48157a11bbc253d1bf6907658c35fead9ba940e444aae9eb47"} Apr 22 14:25:16.827797 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:16.827775 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:16.886526 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:16.886492 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwffd\" (UniqueName: \"kubernetes.io/projected/e57e4ed2-4369-49b9-a63a-b7e6f890d207-kube-api-access-vwffd\") pod \"e57e4ed2-4369-49b9-a63a-b7e6f890d207\" (UID: \"e57e4ed2-4369-49b9-a63a-b7e6f890d207\") " Apr 22 14:25:16.888579 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:16.888548 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57e4ed2-4369-49b9-a63a-b7e6f890d207-kube-api-access-vwffd" (OuterVolumeSpecName: "kube-api-access-vwffd") pod "e57e4ed2-4369-49b9-a63a-b7e6f890d207" (UID: "e57e4ed2-4369-49b9-a63a-b7e6f890d207"). InnerVolumeSpecName "kube-api-access-vwffd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:25:16.987812 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:16.987730 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwffd\" (UniqueName: \"kubernetes.io/projected/e57e4ed2-4369-49b9-a63a-b7e6f890d207-kube-api-access-vwffd\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 14:25:17.703680 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:17.703650 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-hwb4t" Apr 22 14:25:17.703680 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:17.703660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-hwb4t" event={"ID":"e57e4ed2-4369-49b9-a63a-b7e6f890d207","Type":"ContainerDied","Data":"67ccacd3841f73a26994be1760c2f8cfd58392da645e94f96e17c327e42b08c5"} Apr 22 14:25:17.703897 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:17.703699 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ccacd3841f73a26994be1760c2f8cfd58392da645e94f96e17c327e42b08c5" Apr 22 14:25:25.765627 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:25.765597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:25:25.770976 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:25:25.770953 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:30:25.786916 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:30:25.786876 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:30:25.792440 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:30:25.792406 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:35:25.807007 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:35:25.806975 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:35:25.813775 ip-10-0-139-83 
kubenswrapper[2576]: I0422 14:35:25.813745 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:40:25.827085 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:40:25.827053 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:40:25.834170 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:40:25.834149 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:45:25.846131 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:45:25.846101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:45:25.859552 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:45:25.859526 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:50:25.865412 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:50:25.865385 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:50:25.880569 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:50:25.880547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:55:25.885025 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:55:25.884997 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 14:55:25.906071 ip-10-0-139-83 kubenswrapper[2576]: I0422 14:55:25.906044 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:00:25.912388 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:00:25.912363 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:00:25.928775 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:00:25.928753 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:05:25.936621 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:05:25.936591 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:05:25.950103 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:05:25.950079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:10:25.956287 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:10:25.956261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:10:25.970719 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:10:25.970699 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:15:25.980335 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:25.980306 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:15:25.990796 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:25.990777 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:15:59.421107 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.421077 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxxdt/must-gather-9pbw6"] Apr 22 15:15:59.421588 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.421358 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e57e4ed2-4369-49b9-a63a-b7e6f890d207" containerName="s3-tls-init-serving" Apr 22 15:15:59.421588 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.421368 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57e4ed2-4369-49b9-a63a-b7e6f890d207" containerName="s3-tls-init-serving" Apr 22 15:15:59.421588 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.421434 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e57e4ed2-4369-49b9-a63a-b7e6f890d207" containerName="s3-tls-init-serving" Apr 22 15:15:59.429376 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.429342 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.432249 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.432196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vxxdt\"/\"default-dockercfg-9ddtj\"" Apr 22 15:15:59.433383 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.433354 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxdt\"/\"kube-root-ca.crt\"" Apr 22 15:15:59.433528 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.433428 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxdt\"/\"openshift-service-ca.crt\"" Apr 22 15:15:59.434834 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.434796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxdt/must-gather-9pbw6"] Apr 22 15:15:59.502896 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.502869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-must-gather-output\") pod \"must-gather-9pbw6\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.503014 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.502914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7757l\" (UniqueName: \"kubernetes.io/projected/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-kube-api-access-7757l\") pod \"must-gather-9pbw6\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.603752 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.603728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-must-gather-output\") pod \"must-gather-9pbw6\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.603886 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.603771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7757l\" (UniqueName: \"kubernetes.io/projected/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-kube-api-access-7757l\") pod \"must-gather-9pbw6\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.604037 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.604019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-must-gather-output\") pod \"must-gather-9pbw6\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.611692 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.611675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7757l\" (UniqueName: \"kubernetes.io/projected/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-kube-api-access-7757l\") pod \"must-gather-9pbw6\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.739373 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.739307 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:15:59.858316 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.858202 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxdt/must-gather-9pbw6"] Apr 22 15:15:59.860988 ip-10-0-139-83 kubenswrapper[2576]: W0422 15:15:59.860957 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9bbb2ed_8d98_4961_bd2b_7cba0965658f.slice/crio-a2e663e61449bb6b172b57fb0bd9531fdc8c4279c20b6a93bc78bc21625459d1 WatchSource:0}: Error finding container a2e663e61449bb6b172b57fb0bd9531fdc8c4279c20b6a93bc78bc21625459d1: Status 404 returned error can't find the container with id a2e663e61449bb6b172b57fb0bd9531fdc8c4279c20b6a93bc78bc21625459d1 Apr 22 15:15:59.862642 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:15:59.862627 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:16:00.246708 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:00.246673 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" event={"ID":"b9bbb2ed-8d98-4961-bd2b-7cba0965658f","Type":"ContainerStarted","Data":"a2e663e61449bb6b172b57fb0bd9531fdc8c4279c20b6a93bc78bc21625459d1"} Apr 22 15:16:06.267726 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:06.267692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" event={"ID":"b9bbb2ed-8d98-4961-bd2b-7cba0965658f","Type":"ContainerStarted","Data":"8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973"} Apr 22 15:16:06.267726 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:06.267727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" 
event={"ID":"b9bbb2ed-8d98-4961-bd2b-7cba0965658f","Type":"ContainerStarted","Data":"784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e"} Apr 22 15:16:06.285614 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:06.285561 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" podStartSLOduration=1.694710312 podStartE2EDuration="7.285546879s" podCreationTimestamp="2026-04-22 15:15:59 +0000 UTC" firstStartedPulling="2026-04-22 15:15:59.862768559 +0000 UTC m=+3634.679548903" lastFinishedPulling="2026-04-22 15:16:05.453605122 +0000 UTC m=+3640.270385470" observedRunningTime="2026-04-22 15:16:06.283620697 +0000 UTC m=+3641.100401063" watchObservedRunningTime="2026-04-22 15:16:06.285546879 +0000 UTC m=+3641.102327244" Apr 22 15:16:27.332086 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:27.332054 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerID="784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e" exitCode=0 Apr 22 15:16:27.332454 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:27.332130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" event={"ID":"b9bbb2ed-8d98-4961-bd2b-7cba0965658f","Type":"ContainerDied","Data":"784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e"} Apr 22 15:16:27.332454 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:27.332436 2576 scope.go:117] "RemoveContainer" containerID="784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e" Apr 22 15:16:28.244019 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:28.243990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vxxdt_must-gather-9pbw6_b9bbb2ed-8d98-4961-bd2b-7cba0965658f/gather/0.log" Apr 22 15:16:32.037437 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:32.037407 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-7bgl8_274b8db4-5e01-406a-b732-06e1a0f63ab2/global-pull-secret-syncer/0.log" Apr 22 15:16:32.203782 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:32.203658 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g9d2z_1563b033-b9b8-425b-ab3d-4a3b05b42fec/konnectivity-agent/0.log" Apr 22 15:16:32.302385 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:32.302316 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-83.ec2.internal_4b55a5e2e0ee6a678340d55dba3cf653/haproxy/0.log" Apr 22 15:16:33.736600 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:33.736564 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vxxdt/must-gather-9pbw6"] Apr 22 15:16:33.737049 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:33.736796 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="copy" containerID="cri-o://8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973" gracePeriod=2 Apr 22 15:16:33.746034 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:33.745999 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vxxdt/must-gather-9pbw6"] Apr 22 15:16:33.958094 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:33.958072 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vxxdt_must-gather-9pbw6_b9bbb2ed-8d98-4961-bd2b-7cba0965658f/copy/0.log" Apr 22 15:16:33.958388 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:33.958373 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:16:34.072498 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.072471 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7757l\" (UniqueName: \"kubernetes.io/projected/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-kube-api-access-7757l\") pod \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " Apr 22 15:16:34.072629 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.072540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-must-gather-output\") pod \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\" (UID: \"b9bbb2ed-8d98-4961-bd2b-7cba0965658f\") " Apr 22 15:16:34.073934 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.073912 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b9bbb2ed-8d98-4961-bd2b-7cba0965658f" (UID: "b9bbb2ed-8d98-4961-bd2b-7cba0965658f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:16:34.074511 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.074484 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-kube-api-access-7757l" (OuterVolumeSpecName: "kube-api-access-7757l") pod "b9bbb2ed-8d98-4961-bd2b-7cba0965658f" (UID: "b9bbb2ed-8d98-4961-bd2b-7cba0965658f"). InnerVolumeSpecName "kube-api-access-7757l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:16:34.173239 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.173207 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7757l\" (UniqueName: \"kubernetes.io/projected/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-kube-api-access-7757l\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 15:16:34.173239 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.173235 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9bbb2ed-8d98-4961-bd2b-7cba0965658f-must-gather-output\") on node \"ip-10-0-139-83.ec2.internal\" DevicePath \"\"" Apr 22 15:16:34.352361 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.352286 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vxxdt_must-gather-9pbw6_b9bbb2ed-8d98-4961-bd2b-7cba0965658f/copy/0.log" Apr 22 15:16:34.352632 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.352604 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerID="8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973" exitCode=143 Apr 22 15:16:34.352731 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.352651 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxxdt/must-gather-9pbw6" Apr 22 15:16:34.352731 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.352708 2576 scope.go:117] "RemoveContainer" containerID="8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973" Apr 22 15:16:34.359869 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.359852 2576 scope.go:117] "RemoveContainer" containerID="784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e" Apr 22 15:16:34.371358 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.371341 2576 scope.go:117] "RemoveContainer" containerID="8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973" Apr 22 15:16:34.371612 ip-10-0-139-83 kubenswrapper[2576]: E0422 15:16:34.371595 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973\": container with ID starting with 8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973 not found: ID does not exist" containerID="8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973" Apr 22 15:16:34.371651 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.371622 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973"} err="failed to get container status \"8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973\": rpc error: code = NotFound desc = could not find container \"8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973\": container with ID starting with 8d72017b4e2683543742723a6c73f23bc7189f78f3dd35fce89e7385f691f973 not found: ID does not exist" Apr 22 15:16:34.371651 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.371642 2576 scope.go:117] "RemoveContainer" containerID="784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e" Apr 22 15:16:34.371883 ip-10-0-139-83 
kubenswrapper[2576]: E0422 15:16:34.371866 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e\": container with ID starting with 784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e not found: ID does not exist" containerID="784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e" Apr 22 15:16:34.371927 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:34.371890 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e"} err="failed to get container status \"784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e\": rpc error: code = NotFound desc = could not find container \"784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e\": container with ID starting with 784e8bd075e6b339e0cf2f098b30d2276513dab8d8251141c45580d945f78b1e not found: ID does not exist" Apr 22 15:16:35.449249 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.449174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-wdp6g_f63e052a-04bc-4d76-aa3d-2174723ea360/monitoring-plugin/0.log" Apr 22 15:16:35.608262 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.608233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j26mv_475697af-7e97-4fd4-b02c-caf6094bb0b3/node-exporter/0.log" Apr 22 15:16:35.637234 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.637211 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j26mv_475697af-7e97-4fd4-b02c-caf6094bb0b3/kube-rbac-proxy/0.log" Apr 22 15:16:35.664112 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.664092 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-j26mv_475697af-7e97-4fd4-b02c-caf6094bb0b3/init-textfile/0.log" Apr 22 15:16:35.777662 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.777597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m2xrv_9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0/kube-rbac-proxy-main/0.log" Apr 22 15:16:35.803362 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.803340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m2xrv_9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0/kube-rbac-proxy-self/0.log" Apr 22 15:16:35.830543 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.830521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-m2xrv_9ecbca1a-cf02-4c2f-8d14-cd9996ede0d0/openshift-state-metrics/0.log" Apr 22 15:16:35.884643 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.884621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/prometheus/0.log" Apr 22 15:16:35.896032 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.896004 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" path="/var/lib/kubelet/pods/b9bbb2ed-8d98-4961-bd2b-7cba0965658f/volumes" Apr 22 15:16:35.910869 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.910851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/config-reloader/0.log" Apr 22 15:16:35.947074 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:35.947058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/thanos-sidecar/0.log" Apr 22 15:16:35.977584 ip-10-0-139-83 kubenswrapper[2576]: I0422 
15:16:35.977568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/kube-rbac-proxy-web/0.log" Apr 22 15:16:36.009737 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:36.009723 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/kube-rbac-proxy/0.log" Apr 22 15:16:36.036663 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:36.036599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/kube-rbac-proxy-thanos/0.log" Apr 22 15:16:36.062765 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:36.062751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5daecb2-d73a-4b73-b67c-09a111c66037/init-config-reloader/0.log" Apr 22 15:16:36.183228 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:36.183200 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-96d74d89d-4pplr_02d8894a-49b2-46be-a938-78b844b544d1/telemeter-client/0.log" Apr 22 15:16:36.208618 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:36.208594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-96d74d89d-4pplr_02d8894a-49b2-46be-a938-78b844b544d1/reload/0.log" Apr 22 15:16:36.240341 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:36.240323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-96d74d89d-4pplr_02d8894a-49b2-46be-a938-78b844b544d1/kube-rbac-proxy/0.log" Apr 22 15:16:37.621149 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:37.621057 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-82gp2_bed76823-3878-431a-8956-8c2da1fe873b/networking-console-plugin/0.log" Apr 22 15:16:38.147804 
ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:38.147729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/1.log" Apr 22 15:16:38.152254 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:38.152218 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-62jld_b9478150-727c-42e1-b8be-a5fcc142a5b7/console-operator/2.log" Apr 22 15:16:39.234031 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.233995 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d"] Apr 22 15:16:39.234370 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.234276 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="copy" Apr 22 15:16:39.234370 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.234286 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="copy" Apr 22 15:16:39.234370 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.234307 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="gather" Apr 22 15:16:39.234370 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.234312 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="gather" Apr 22 15:16:39.234370 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.234354 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="copy" Apr 22 15:16:39.234370 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.234362 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9bbb2ed-8d98-4961-bd2b-7cba0965658f" containerName="gather" Apr 22 
15:16:39.237560 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.237540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.240173 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.240149 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zm2sq\"/\"kube-root-ca.crt\"" Apr 22 15:16:39.240368 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.240355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zm2sq\"/\"default-dockercfg-jkn8s\"" Apr 22 15:16:39.241302 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.241286 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zm2sq\"/\"openshift-service-ca.crt\"" Apr 22 15:16:39.247645 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.247623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d"] Apr 22 15:16:39.308700 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.308669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-podres\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.308700 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.308702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-proc\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.308907 ip-10-0-139-83 
kubenswrapper[2576]: I0422 15:16:39.308788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-sys\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.308907 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.308807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-lib-modules\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.308907 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.308854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqhfh\" (UniqueName: \"kubernetes.io/projected/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-kube-api-access-sqhfh\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409585 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-sys\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409585 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-lib-modules\") pod 
\"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqhfh\" (UniqueName: \"kubernetes.io/projected/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-kube-api-access-sqhfh\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-podres\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-proc\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-sys\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-lib-modules\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-proc\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.409778 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.409749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-podres\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.419219 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.419192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqhfh\" (UniqueName: \"kubernetes.io/projected/0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3-kube-api-access-sqhfh\") pod \"perf-node-gather-daemonset-8pn8d\" (UID: \"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3\") " pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.547769 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.547698 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" Apr 22 15:16:39.666215 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.666189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d"] Apr 22 15:16:39.668643 ip-10-0-139-83 kubenswrapper[2576]: W0422 15:16:39.668605 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0093c6b5_58be_46ae_aa8a_0c5d1c41f0a3.slice/crio-c889ef848a8eac438f199c3a919d207fc3e73663cae960e235815aaf2a2de7a0 WatchSource:0}: Error finding container c889ef848a8eac438f199c3a919d207fc3e73663cae960e235815aaf2a2de7a0: Status 404 returned error can't find the container with id c889ef848a8eac438f199c3a919d207fc3e73663cae960e235815aaf2a2de7a0 Apr 22 15:16:39.800225 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.800105 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tr25t_d9fbf8cd-bdeb-41fe-ab55-46ff4906722e/dns/0.log" Apr 22 15:16:39.825672 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.825642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tr25t_d9fbf8cd-bdeb-41fe-ab55-46ff4906722e/kube-rbac-proxy/0.log" Apr 22 15:16:39.955651 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:39.955612 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lhmjg_7fd35d56-d453-4a46-9d90-e6d7f3ddd0ba/dns-node-resolver/0.log" Apr 22 15:16:40.374381 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:40.374342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" event={"ID":"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3","Type":"ContainerStarted","Data":"6fafab057666427c3a26abf15a2c2ba3ad550bc3d37736e1938f5eaddde4501e"} Apr 22 15:16:40.374381 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:40.374379 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" event={"ID":"0093c6b5-58be-46ae-aa8a-0c5d1c41f0a3","Type":"ContainerStarted","Data":"c889ef848a8eac438f199c3a919d207fc3e73663cae960e235815aaf2a2de7a0"}
Apr 22 15:16:40.374953 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:40.374404 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d"
Apr 22 15:16:40.391430 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:40.391385 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d" podStartSLOduration=1.391366772 podStartE2EDuration="1.391366772s" podCreationTimestamp="2026-04-22 15:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:16:40.390535857 +0000 UTC m=+3675.207316223" watchObservedRunningTime="2026-04-22 15:16:40.391366772 +0000 UTC m=+3675.208147139"
Apr 22 15:16:40.437528 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:40.437494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9rd7w_377748a7-900a-4086-b92d-5dcf4538b46f/node-ca/0.log"
Apr 22 15:16:41.289870 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:41.289836 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-548fb84bdd-rjs5x_eae92029-ce85-4f67-9ea7-939a019950c7/router/0.log"
Apr 22 15:16:41.730739 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:41.730710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kcjkd_faa0ae94-53cb-46ba-af35-0e690ed5b286/serve-healthcheck-canary/0.log"
Apr 22 15:16:42.188326 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:42.188295 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5d5wd_e4c787ee-643f-4845-a682-1653914d9f62/kube-rbac-proxy/0.log"
Apr 22 15:16:42.227989 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:42.227962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5d5wd_e4c787ee-643f-4845-a682-1653914d9f62/exporter/0.log"
Apr 22 15:16:42.300570 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:42.300547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5d5wd_e4c787ee-643f-4845-a682-1653914d9f62/extractor/0.log"
Apr 22 15:16:44.956376 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:44.956351 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-wxnt6_8dfd969a-9d00-4e71-981c-8000c07daf6a/s3-init/0.log"
Apr 22 15:16:44.984964 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:44.984941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-sr8c2_b422fb0f-a80b-4149-bd14-d82c76d2a785/s3-tls-init-custom/0.log"
Apr 22 15:16:45.011386 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:45.011358 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-hwb4t_e57e4ed2-4369-49b9-a63a-b7e6f890d207/s3-tls-init-serving/0.log"
Apr 22 15:16:46.387205 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:46.387177 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zm2sq/perf-node-gather-daemonset-8pn8d"
Apr 22 15:16:49.043865 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:49.043831 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-j2xph_7ed77f3f-5234-43d6-b54b-0922379dce13/migrator/0.log"
Apr 22 15:16:49.077168 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:49.077134 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-j2xph_7ed77f3f-5234-43d6-b54b-0922379dce13/graceful-termination/0.log"
Apr 22 15:16:49.494341 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:49.494250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6rwp4_cc685723-3a77-43be-b573-ae8ca5c62f4a/kube-storage-version-migrator-operator/1.log"
Apr 22 15:16:49.495229 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:49.495198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6rwp4_cc685723-3a77-43be-b573-ae8ca5c62f4a/kube-storage-version-migrator-operator/0.log"
Apr 22 15:16:50.740973 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.740939 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/kube-multus-additional-cni-plugins/0.log"
Apr 22 15:16:50.765358 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.765332 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/egress-router-binary-copy/0.log"
Apr 22 15:16:50.790508 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.790487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/cni-plugins/0.log"
Apr 22 15:16:50.817123 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.817106 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/bond-cni-plugin/0.log"
Apr 22 15:16:50.841127 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.841109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/routeoverride-cni/0.log"
Apr 22 15:16:50.865231 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.865210 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/whereabouts-cni-bincopy/0.log"
Apr 22 15:16:50.891066 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:50.891044 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lbhgd_87e0334c-0350-4896-8f0a-f8f03953749a/whereabouts-cni/0.log"
Apr 22 15:16:51.123986 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:51.123960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pn6dm_836b411b-66a6-4504-9937-fe987775439a/kube-multus/0.log"
Apr 22 15:16:51.151898 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:51.151873 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dngb2_7d49b78a-27ae-4f41-a759-29b898bf6fe1/network-metrics-daemon/0.log"
Apr 22 15:16:51.178544 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:51.178513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dngb2_7d49b78a-27ae-4f41-a759-29b898bf6fe1/kube-rbac-proxy/0.log"
Apr 22 15:16:52.115929 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.115902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/ovn-controller/0.log"
Apr 22 15:16:52.157766 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.157737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/ovn-acl-logging/0.log"
Apr 22 15:16:52.191134 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.191103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/kube-rbac-proxy-node/0.log"
Apr 22 15:16:52.220838 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.220799 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 15:16:52.245784 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.245757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/northd/0.log"
Apr 22 15:16:52.273275 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.273250 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/nbdb/0.log"
Apr 22 15:16:52.297648 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.297620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/sbdb/0.log"
Apr 22 15:16:52.399760 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:52.399688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbtw8_baba1b59-01cc-4a9a-8350-a118e41a4e8b/ovnkube-controller/0.log"
Apr 22 15:16:54.177582 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:54.177556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xj22k_0f845cc3-634e-4134-8f72-6e6eb367d773/network-check-target-container/0.log"
Apr 22 15:16:55.198551 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:55.198511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jrk58_2f743df8-0701-4855-a2fa-4b71d8a6efc9/iptables-alerter/0.log"
Apr 22 15:16:55.917781 ip-10-0-139-83 kubenswrapper[2576]: I0422 15:16:55.917751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zflp5_ace2c57e-f44e-4d5d-b3fc-b036816a748d/tuned/0.log"