Apr 16 08:33:10.840333 ip-10-0-128-115 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 08:33:11.271977 ip-10-0-128-115 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:33:11.271977 ip-10-0-128-115 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 08:33:11.271977 ip-10-0-128-115 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:33:11.271977 ip-10-0-128-115 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 08:33:11.271977 ip-10-0-128-115 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:33:11.273195 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.273118 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 08:33:11.277983 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.277968 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:33:11.277983 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.277984 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.277988 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.277991 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.277994 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.277997 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278000 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278003 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278005 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278008 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278012 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278014 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278039 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278043 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278045 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278048 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278051 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278053 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278057 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278060 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278062 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:33:11.278066 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278068 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278072 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278075 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278077 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278081 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278084 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278086 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278089 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278092 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278094 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278097 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278099 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278103 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278107 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278110 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278113 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278115 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278118 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278120 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:33:11.278546 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278124 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278127 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278129 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278132 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278134 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278137 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278139 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278141 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278144 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278146 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278149 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278152 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278154 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278157 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278160 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278163 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278166 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278168 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278171 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278174 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:33:11.279013 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278176 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278179 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278181 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278184 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278186 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278189 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278192 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278201 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278204 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278206 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278209 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278211 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278214 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278216 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278219 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278228 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278231 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278234 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278236 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278239 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:33:11.279516 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278243 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278245 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278250 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278253 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278256 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278259 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278689 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278695 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278699 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278702 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278705 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278708 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278711 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278714 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278718 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278721 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278723 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278726 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278728 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:33:11.279997 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278731 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278733 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278736 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278739 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278742 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278744 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278746 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278749 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278760 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278764 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278767 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278770 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278772 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278775 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278778 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278780 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278783 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278785 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278788 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:33:11.280493 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278790 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278795 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278797 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278800 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278803 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278805 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278808 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278811 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278813 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278815 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278818 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278820 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278823 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278825 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278828 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278831 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278833 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278835 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278838 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278840 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:33:11.280991 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278848 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278851 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278859 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278861 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278864 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278867 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278869 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278872 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278874 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278877 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278880 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278882 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278885 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278888 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278891 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278893 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278896 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278898 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278901 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278904 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:33:11.281491 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278907 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278910 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278912 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278914 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278917 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278919 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278922 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278924 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278927 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278929 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278932 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278934 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278937 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.278939 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280296 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280313 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280322 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280327 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280331 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280334 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280339 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 08:33:11.281966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280343 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280347 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280350 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280354 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280358 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280361 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280364 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280367 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280370 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280373 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280375 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280378 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280385 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280388 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280391 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280393 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280397 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280401 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280404 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280407 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280410 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280413 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280416 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280419 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280422 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 08:33:11.282486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280425 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280429 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280439 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280442 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280445 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280448 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280450 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280457 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280460 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280464 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280467 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280470 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280474 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280477 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280480 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280483 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280486 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280489 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280492 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280495 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280498 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280501 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280504 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280508 2574 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280511 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 08:33:11.283097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280514 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280517 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280520 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280523 2574 flags.go:64] FLAG: --help="false" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280526 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280529 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280532 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280535 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280538 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280541 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280550 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280553 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 08:33:11.283704 ip-10-0-128-115 
kubenswrapper[2574]: I0416 08:33:11.280556 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280559 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280562 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280565 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280567 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280570 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280573 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280576 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280579 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280582 2574 flags.go:64] FLAG: --lock-file="" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280585 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280587 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 08:33:11.283704 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280591 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280596 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280599 2574 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280602 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280604 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280608 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280611 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280614 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280617 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280621 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280624 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280628 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280631 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280634 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280637 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280640 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280643 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 08:33:11.284293 
ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280645 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280648 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280657 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280660 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280663 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280666 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 08:33:11.284293 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280669 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280675 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280678 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280681 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280684 2574 flags.go:64] FLAG: --port="10250" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280687 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280690 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b9f164dae3cf97be" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280693 2574 flags.go:64] FLAG: --qos-reserved="" Apr 
16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280696 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280699 2574 flags.go:64] FLAG: --register-node="true" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280702 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280705 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280709 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280711 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280714 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280717 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280720 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280723 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280727 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280729 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280732 2574 flags.go:64] FLAG: --runonce="false" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280735 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280738 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280741 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280744 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280747 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 08:33:11.284834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280750 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280753 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280756 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280759 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280762 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280765 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280767 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280770 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280773 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280777 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280782 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 08:33:11.285474 ip-10-0-128-115 
kubenswrapper[2574]: I0416 08:33:11.280785 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280788 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280794 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280797 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280800 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280802 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280805 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280808 2574 flags.go:64] FLAG: --v="2" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280813 2574 flags.go:64] FLAG: --version="false" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280817 2574 flags.go:64] FLAG: --vmodule="" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280821 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.280824 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280932 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:33:11.285474 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280935 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280938 2574 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280941 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280943 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280946 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280948 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280951 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280953 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280956 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280958 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280961 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280963 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280966 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280969 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 
08:33:11.280971 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280974 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280976 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280979 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280982 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280985 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280987 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:33:11.286093 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280990 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280992 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280995 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.280998 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281003 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281007 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 
08:33:11.281011 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281015 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281033 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281037 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281041 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281044 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281047 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281050 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281053 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281055 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281058 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281061 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281064 2574 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Apr 16 08:33:11.286645 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281068 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281071 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281074 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281077 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281080 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281087 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281090 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281093 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281096 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281098 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281101 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281104 2574 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281106 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281109 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281111 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281114 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281117 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281122 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281124 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:33:11.287158 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281127 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281129 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281132 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281134 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281137 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 
08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281139 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281142 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281144 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281147 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281150 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281152 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281155 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281157 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281160 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281162 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281165 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281168 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281170 2574 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281173 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281176 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:33:11.287633 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281178 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:33:11.288149 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281181 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:33:11.288149 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281184 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:33:11.288149 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281186 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:33:11.288149 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281189 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:33:11.288149 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.281191 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:33:11.288149 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.282037 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.288202 2574 
server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.288217 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288277 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288283 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288286 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288289 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288292 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288295 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288299 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288302 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288305 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:33:11.288303 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288308 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288311 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:33:11.288590 ip-10-0-128-115 
kubenswrapper[2574]: W0416 08:33:11.288314 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288317 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288320 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288323 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288326 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288329 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288333 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288336 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288339 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288341 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288344 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288346 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288349 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 
08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288351 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288353 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288356 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288358 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288361 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:33:11.288590 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288364 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288366 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288369 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288372 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288375 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288377 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288380 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288382 2574 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288385 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288388 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288390 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288393 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288395 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288398 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288402 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288404 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288407 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288409 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288412 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:33:11.289090 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288415 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:33:11.289090 ip-10-0-128-115 
kubenswrapper[2574]: W0416 08:33:11.288417 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288420 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288422 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288425 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288428 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288430 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288433 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288436 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288439 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288441 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288444 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288446 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288449 2574 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288451 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288454 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288457 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288461 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288465 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288468 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:33:11.289583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288472 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288476 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288479 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288482 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288485 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288487 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288490 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288494 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288496 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288499 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288502 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288504 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288507 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288509 2574 feature_gate.go:328] unrecognized 
feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288512 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288514 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288517 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:33:11.290141 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288519 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.288524 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288631 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288636 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288639 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288642 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288645 2574 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288648 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288651 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288654 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288656 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288659 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288667 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288670 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288673 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:33:11.290549 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288677 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288681 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288684 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288686 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288689 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288691 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288694 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288697 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288700 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288702 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288705 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288708 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288710 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 
08:33:11.288713 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288715 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288718 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288721 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288723 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288727 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:33:11.290993 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288730 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288733 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288735 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288738 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288740 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288743 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288745 2574 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288748 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288751 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288753 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288756 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288764 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288766 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288769 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288771 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288774 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288776 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288778 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288781 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:33:11.291494 ip-10-0-128-115 
kubenswrapper[2574]: W0416 08:33:11.288783 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:33:11.291494 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288787 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288790 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288792 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288795 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288797 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288800 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288802 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288805 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288807 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288810 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288812 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288815 2574 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288817 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288820 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288822 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288825 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288827 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288830 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288833 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288836 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:33:11.291971 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288838 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288841 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288843 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288846 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 
08:33:11.288854 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288856 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288859 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288862 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288864 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288867 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288869 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288872 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288875 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:11.288877 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.288882 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 08:33:11.292476 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.289644 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 08:33:11.293981 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.293967 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 08:33:11.294861 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.294848 2574 server.go:1019] "Starting client certificate rotation" Apr 16 08:33:11.294959 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.294944 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 08:33:11.294995 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.294987 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 08:33:11.321702 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.321688 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 08:33:11.324253 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.324226 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 08:33:11.338767 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.338743 2574 log.go:25] "Validated CRI v1 runtime API" Apr 16 08:33:11.344624 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.344608 2574 log.go:25] "Validated CRI v1 image API" Apr 16 08:33:11.347571 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.347556 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 08:33:11.351308 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.351282 2574 fs.go:135] Filesystem 
UUIDs: map[218ebc29-7ea9-4033-935b-c97977615318:/dev/nvme0n1p3 462cbac3-c6ce-45ef-b0d7-ef7dc1a658b1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 16 08:33:11.351308 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.351302 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 08:33:11.356158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.356039 2574 manager.go:217] Machine: {Timestamp:2026-04-16 08:33:11.354968802 +0000 UTC m=+0.403545940 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099719 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23d489d205c9ad5f281b64fa5bedec SystemUUID:ec23d489-d205-c9ad-5f28-1b64fa5bedec BootID:d1b98e61-9b1e-448d-8082-aaeeec010e0a Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] 
DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:40:07:00:8e:71 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:40:07:00:8e:71 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:de:f1:b9:ba:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 08:33:11.356618 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.356599 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 08:33:11.356689 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.356638 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 08:33:11.356744 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.356729 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 08:33:11.357753 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.357730 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 08:33:11.357883 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.357755 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-115.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 08:33:11.357934 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.357892 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 08:33:11.357934 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.357899 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 08:33:11.357934 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.357916 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 08:33:11.357934 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.357932 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 08:33:11.359480 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.359467 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:33:11.359585 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.359576 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 08:33:11.361986 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.361976 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 08:33:11.362058 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.361990 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 08:33:11.362558 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.362549 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 08:33:11.362587 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.362561 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 08:33:11.362587 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.362570 2574 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 08:33:11.363518 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.363506 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 08:33:11.363563 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.363523 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 08:33:11.366068 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.366052 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 08:33:11.367778 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.367764 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 08:33:11.369282 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369269 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 08:33:11.369337 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369286 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 08:33:11.369337 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369293 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 08:33:11.369337 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369298 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 08:33:11.369337 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369304 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 08:33:11.369337 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369312 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 08:33:11.369337 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369335 2574 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 08:33:11.369505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369343 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 08:33:11.369505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369349 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 08:33:11.369505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369354 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 08:33:11.369505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369371 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 08:33:11.369505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.369380 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 08:33:11.370441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.370424 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 08:33:11.370441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.370434 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 08:33:11.372668 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.372607 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-115.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 08:33:11.373422 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.373396 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 08:33:11.373499 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.373401 2574 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-115.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 08:33:11.374067 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.374055 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 08:33:11.374135 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.374090 2574 server.go:1295] "Started kubelet" Apr 16 08:33:11.374190 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.374165 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 08:33:11.374255 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.374220 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 08:33:11.374291 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.374271 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 08:33:11.374883 ip-10-0-128-115 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 08:33:11.377484 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.377460 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 08:33:11.380533 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.380518 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 08:33:11.381679 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.380753 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-115.ec2.internal.18a6c942b44ad16a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-115.ec2.internal,UID:ip-10-0-128-115.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-115.ec2.internal,},FirstTimestamp:2026-04-16 08:33:11.374066026 +0000 UTC m=+0.422643148,LastTimestamp:2026-04-16 08:33:11.374066026 +0000 UTC m=+0.422643148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-115.ec2.internal,}" Apr 16 08:33:11.382077 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.382056 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 08:33:11.382601 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.382586 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 08:33:11.383348 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383248 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 08:33:11.383460 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383408 2574 factory.go:55] Registering systemd factory Apr 16 08:33:11.383460 
ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383424 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 08:33:11.383680 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.383477 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.383680 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383563 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 08:33:11.383680 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383577 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 08:33:11.383680 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383656 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 08:33:11.383680 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383665 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 08:33:11.383991 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383712 2574 factory.go:153] Registering CRI-O factory Apr 16 08:33:11.383991 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383720 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 08:33:11.383991 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383771 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 08:33:11.383991 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383794 2574 factory.go:103] Registering Raw factory Apr 16 08:33:11.383991 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.383808 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 08:33:11.384447 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.384385 2574 manager.go:319] Starting recovery of all containers Apr 16 08:33:11.384447 ip-10-0-128-115 
kubenswrapper[2574]: I0416 08:33:11.384409 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wl2c4" Apr 16 08:33:11.384583 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.384542 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 08:33:11.391343 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.391317 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wl2c4" Apr 16 08:33:11.393594 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.393434 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 08:33:11.393594 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.393454 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-115.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 08:33:11.394177 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.394073 2574 manager.go:324] Recovery completed Apr 16 08:33:11.397915 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.397903 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:11.400218 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.400202 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:11.400282 
ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.400231 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:11.400282 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.400243 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:11.400738 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.400718 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 08:33:11.400738 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.400733 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 08:33:11.400738 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.400747 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:33:11.402792 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.402771 2574 policy_none.go:49] "None policy: Start" Apr 16 08:33:11.402792 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.402786 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 08:33:11.402904 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.402796 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 08:33:11.435660 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.435645 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.435687 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.435701 2574 server.go:85] "Starting device plugin registration server" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.435985 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.435997 2574 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.436135 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.436213 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.436219 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.436684 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 08:33:11.452473 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.436716 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.532433 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.532376 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 08:33:11.533572 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.533556 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 08:33:11.534237 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.534224 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 08:33:11.534334 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.534247 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 08:33:11.534334 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.534255 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 08:33:11.534334 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.534292 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 08:33:11.536107 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.536087 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:11.536837 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.536821 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:11.536907 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.536848 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:11.536907 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.536885 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:11.536907 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.536908 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.537696 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.537682 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:11.544847 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.544832 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.544933 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.544855 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-115.ec2.internal\": node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 
08:33:11.558602 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.558584 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.635086 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.635055 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal"] Apr 16 08:33:11.635171 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.635134 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:11.635873 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.635860 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:11.635937 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.635885 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:11.635937 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.635899 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:11.637141 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637129 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:11.637294 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637281 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.637341 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637305 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:11.637776 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637755 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:11.637776 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637769 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:11.637776 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637779 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:11.637942 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637793 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:11.637942 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637793 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:11.637942 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.637869 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:11.639573 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.639557 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.639620 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.639592 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:11.640208 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.640194 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:11.640283 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.640213 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:11.640283 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.640222 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:11.659106 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.659086 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.663591 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.663573 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-115.ec2.internal\" not found" node="ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.667888 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.667872 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-115.ec2.internal\" not found" node="ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.686415 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.686396 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b46c00968a1a056bfff21f51e35fabd4-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal\" (UID: \"b46c00968a1a056bfff21f51e35fabd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.686490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.686419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69c436ee0b974ec434e2858234467270-config\") pod \"kube-apiserver-proxy-ip-10-0-128-115.ec2.internal\" (UID: \"69c436ee0b974ec434e2858234467270\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.686490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.686437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b46c00968a1a056bfff21f51e35fabd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal\" (UID: \"b46c00968a1a056bfff21f51e35fabd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.759387 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.759359 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.786763 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.786712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69c436ee0b974ec434e2858234467270-config\") pod \"kube-apiserver-proxy-ip-10-0-128-115.ec2.internal\" (UID: \"69c436ee0b974ec434e2858234467270\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.786763 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.786743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/b46c00968a1a056bfff21f51e35fabd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal\" (UID: \"b46c00968a1a056bfff21f51e35fabd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.786887 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.786768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b46c00968a1a056bfff21f51e35fabd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal\" (UID: \"b46c00968a1a056bfff21f51e35fabd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.786887 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.786815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b46c00968a1a056bfff21f51e35fabd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal\" (UID: \"b46c00968a1a056bfff21f51e35fabd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.786887 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.786821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b46c00968a1a056bfff21f51e35fabd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal\" (UID: \"b46c00968a1a056bfff21f51e35fabd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.786887 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.786821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69c436ee0b974ec434e2858234467270-config\") pod \"kube-apiserver-proxy-ip-10-0-128-115.ec2.internal\" (UID: \"69c436ee0b974ec434e2858234467270\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.859908 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.859882 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.960589 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:11.960560 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:11.966723 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.966709 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:11.970945 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:11.970929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" Apr 16 08:33:12.061730 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:12.061649 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:12.162159 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:12.162122 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:12.262639 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:12.262615 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:12.295094 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.295075 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 08:33:12.295467 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.295211 2574 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 08:33:12.363423 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:12.363363 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-115.ec2.internal\" not found" Apr 16 08:33:12.377564 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.377546 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:12.382182 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.382166 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 08:33:12.382862 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.382846 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" Apr 16 08:33:12.393747 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.393721 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 08:28:11 +0000 UTC" deadline="2027-09-12 02:06:33.607873897 +0000 UTC" Apr 16 08:33:12.393747 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.393742 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12329h33m21.214134747s" Apr 16 08:33:12.394240 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.394226 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 08:33:12.395899 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.395885 2574 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" Apr 16 08:33:12.397601 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.397586 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 08:33:12.404587 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.404574 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 08:33:12.419436 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.419419 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2n7k2" Apr 16 08:33:12.425947 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.425932 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2n7k2" Apr 16 08:33:12.549512 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:12.549483 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c436ee0b974ec434e2858234467270.slice/crio-bf531cb748f9c4a3468526a661557ce14672457981dd73111bebe81f0f1928fb WatchSource:0}: Error finding container bf531cb748f9c4a3468526a661557ce14672457981dd73111bebe81f0f1928fb: Status 404 returned error can't find the container with id bf531cb748f9c4a3468526a661557ce14672457981dd73111bebe81f0f1928fb Apr 16 08:33:12.550640 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:12.550617 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb46c00968a1a056bfff21f51e35fabd4.slice/crio-252395c90c4cec14edc3422675bacfec05023fa4c5a1b027873ae072a395b3d0 WatchSource:0}: Error finding container 252395c90c4cec14edc3422675bacfec05023fa4c5a1b027873ae072a395b3d0: Status 404 
returned error can't find the container with id 252395c90c4cec14edc3422675bacfec05023fa4c5a1b027873ae072a395b3d0 Apr 16 08:33:12.555364 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.555350 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:33:12.584279 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.584259 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:12.898592 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:12.898570 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:13.364449 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.364369 2574 apiserver.go:52] "Watching apiserver" Apr 16 08:33:13.373041 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.373007 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 08:33:13.374977 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.374947 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw","openshift-cluster-node-tuning-operator/tuned-6rqnd","openshift-image-registry/node-ca-mvrgd","openshift-network-diagnostics/network-check-target-pq6xw","openshift-network-operator/iptables-alerter-2xl8x","openshift-dns/node-resolver-wqhp5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal","openshift-multus/multus-additional-cni-plugins-87mwn","openshift-multus/multus-b8fml","openshift-multus/network-metrics-daemon-r9fn4","openshift-ovn-kubernetes/ovnkube-node-f6wjp","kube-system/konnectivity-agent-rllmj"] Apr 16 08:33:13.376527 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.376503 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wqhp5" Apr 16 08:33:13.377782 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.377757 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.378615 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.378591 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.378698 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.378632 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j754g\"" Apr 16 08:33:13.378698 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.378678 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.379059 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.379041 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.380244 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.380224 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 08:33:13.380360 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.380281 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.380360 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.380312 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j6zpl\"" Apr 16 08:33:13.380533 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.380512 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.381201 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.381180 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.381460 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.381441 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.381551 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.381540 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-d6lsm\"" Apr 16 08:33:13.381637 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.381622 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mvrgd" Apr 16 08:33:13.382788 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.382772 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:13.382887 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.382837 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:13.383790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.383772 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.383910 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.383804 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.383910 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.383866 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 08:33:13.384422 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.384405 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2xl8x" Apr 16 08:33:13.385411 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.384941 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dffzl\"" Apr 16 08:33:13.385824 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.385771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.386121 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.386102 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.386867 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.386667 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.386867 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.386710 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.386867 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.386785 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jk5tm\"" Apr 16 08:33:13.386867 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.386711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 08:33:13.387131 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.387084 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:13.387184 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.387152 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:13.388384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.387886 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.388384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.387997 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 08:33:13.388384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.388198 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6bddg\"" Apr 16 08:33:13.388384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.388262 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 08:33:13.388384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.388273 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 08:33:13.388658 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.388475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 08:33:13.388658 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.388483 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-qtgdf\"" Apr 16 08:33:13.388658 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.388490 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.389943 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.389922 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rllmj" Apr 16 08:33:13.390101 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.390082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.392155 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.392135 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 08:33:13.393096 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393077 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q4ckz\"" Apr 16 08:33:13.393186 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393156 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 08:33:13.393256 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393215 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 08:33:13.393313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393289 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 08:33:13.393368 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393317 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sccxv\"" Apr 16 08:33:13.393368 
ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393335 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 08:33:13.393462 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393445 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 08:33:13.393701 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393681 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 08:33:13.393780 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.393751 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 08:33:13.394181 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6qdp\" (UniqueName: \"kubernetes.io/projected/eacd4fff-e409-4534-945c-507d909b8258-kube-api-access-b6qdp\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd" Apr 16 08:33:13.394268 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-os-release\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.394268 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.394268 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-socket-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.394391 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-registration-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.394391 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-lib-modules\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394391 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp6d\" (UniqueName: \"kubernetes.io/projected/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-kube-api-access-vhp6d\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x" Apr 
16 08:33:13.394391 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9flxv\" (UniqueName: \"kubernetes.io/projected/994019bc-fe5d-4c20-abc0-f589b27a59ca-kube-api-access-9flxv\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.394391 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394363 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-device-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.394391 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysctl-conf\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394585 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-host-slash\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x" Apr 16 08:33:13.394585 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-system-cni-dir\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.394585 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-etc-selinux\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.394585 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394556 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-kubernetes\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394712 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-tuned\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394712 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-tmp\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394712 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394646 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99c69840-b6dc-47aa-a435-9c9a49111d84-hosts-file\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5" Apr 16 08:33:13.394712 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eacd4fff-e409-4534-945c-507d909b8258-serviceca\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-run\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394775 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-sys\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kw66\" (UniqueName: \"kubernetes.io/projected/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-kube-api-access-2kw66\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394804 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thcq4\" (UniqueName: \"kubernetes.io/projected/99c69840-b6dc-47aa-a435-9c9a49111d84-kube-api-access-thcq4\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eacd4fff-e409-4534-945c-507d909b8258-host\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n65ft\" (UniqueName: 
\"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:13.394917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-cnibin\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjs5\" (UniqueName: \"kubernetes.io/projected/93104adf-798d-4000-97f1-77b9402f3a86-kube-api-access-5fjs5\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.394991 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-iptables-alerter-script\") pod \"iptables-alerter-2xl8x\" (UID: 
\"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-modprobe-d\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-var-lib-kubelet\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-sys-fs\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysconfig\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395217 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-host\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99c69840-b6dc-47aa-a435-9c9a49111d84-tmp-dir\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5"
Apr 16 08:33:13.395271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysctl-d\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.395809 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.395280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-systemd\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.427791 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.427568 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:28:12 +0000 UTC" deadline="2027-11-06 06:25:37.055373363 +0000 UTC"
Apr 16 08:33:13.427791 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.427592 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13653h52m23.627783561s"
Apr 16 08:33:13.484407 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.484390 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 08:33:13.495605 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-kubelet\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.495605 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-daemon-config\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.495766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495629 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.495766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6qdp\" (UniqueName: \"kubernetes.io/projected/eacd4fff-e409-4534-945c-507d909b8258-kube-api-access-b6qdp\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd"
Apr 16 08:33:13.495766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.495766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-lib-modules\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.495766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhp6d\" (UniqueName: \"kubernetes.io/projected/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-kube-api-access-vhp6d\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x"
Apr 16 08:33:13.495766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-etc-kubernetes\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-lib-modules\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495872 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-slash\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-run-netns\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-var-lib-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.495990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-system-cni-dir\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.496069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-system-cni-dir\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-etc-selinux\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-tmp\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-kubelet\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-systemd-units\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-etc-selinux\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-ovnkube-config\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496235 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d6cfd862-350d-4749-a71a-3363dc8bbfa0-agent-certs\") pod \"konnectivity-agent-rllmj\" (UID: \"d6cfd862-350d-4749-a71a-3363dc8bbfa0\") " pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99c69840-b6dc-47aa-a435-9c9a49111d84-hosts-file\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eacd4fff-e409-4534-945c-507d909b8258-serviceca\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-sys\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/99c69840-b6dc-47aa-a435-9c9a49111d84-hosts-file\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5"
Apr 16 08:33:13.496401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kw66\" (UniqueName: \"kubernetes.io/projected/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-kube-api-access-2kw66\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496399 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-iptables-alerter-script\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-system-cni-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-cni-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-sys\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-cni-multus\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thcq4\" (UniqueName: \"kubernetes.io/projected/99c69840-b6dc-47aa-a435-9c9a49111d84-kube-api-access-thcq4\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fjs5\" (UniqueName: \"kubernetes.io/projected/93104adf-798d-4000-97f1-77b9402f3a86-kube-api-access-5fjs5\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-hostroot\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rx5t\" (UniqueName: \"kubernetes.io/projected/b1290b06-222c-45ae-985a-c88370488114-kube-api-access-5rx5t\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eacd4fff-e409-4534-945c-507d909b8258-serviceca\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-modprobe-d\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-cnibin\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.496911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-modprobe-d\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.496989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-host\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-netns\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-iptables-alerter-script\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89wdh\" (UniqueName: \"kubernetes.io/projected/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-kube-api-access-89wdh\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-host\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497108 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-ovn\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-cni-netd\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysctl-d\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-os-release\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497246 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-socket-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-registration-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-node-log\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497325 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-os-release\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-ovnkube-script-lib\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.497560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d6cfd862-350d-4749-a71a-3363dc8bbfa0-konnectivity-ca\") pod \"konnectivity-agent-rllmj\" (UID: \"d6cfd862-350d-4749-a71a-3363dc8bbfa0\") " pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9flxv\" (UniqueName: \"kubernetes.io/projected/994019bc-fe5d-4c20-abc0-f589b27a59ca-kube-api-access-9flxv\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-registration-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497414 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-socket-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-device-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysctl-d\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysctl-conf\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-device-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-host-slash\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497559 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-os-release\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-host-slash\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497622 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-etc-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysctl-conf\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82x8g\" (UniqueName: \"kubernetes.io/projected/d6bc0f25-3003-4855-b122-6d1820717354-kube-api-access-82x8g\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-kubernetes\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-tuned\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6bc0f25-3003-4855-b122-6d1820717354-ovn-node-metrics-cert\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.498313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497829 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-kubernetes\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-run\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-cni-binary-copy\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-socket-dir-parent\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497991 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-cni-bin\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.497994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-run\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498005 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-env-overrides\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eacd4fff-e409-4534-945c-507d909b8258-host\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-cnibin\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eacd4fff-e409-4534-945c-507d909b8258-host\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416
08:33:13.498174 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-cnibin\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-k8s-cni-cncf-io\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498275 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.499035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-var-lib-kubelet\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-conf-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " 
pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-multus-certs\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-systemd\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-log-socket\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-cni-bin\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-sys-fs\") pod \"aws-ebs-csi-driver-node-qfzqw\" 
(UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysconfig\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/994019bc-fe5d-4c20-abc0-f589b27a59ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99c69840-b6dc-47aa-a435-9c9a49111d84-tmp-dir\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-systemd\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-systemd\") pod \"tuned-6rqnd\" 
(UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498693 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-var-lib-kubelet\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498739 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99c69840-b6dc-47aa-a435-9c9a49111d84-tmp-dir\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498749 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/93104adf-798d-4000-97f1-77b9402f3a86-sys-fs\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.498752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-sysconfig\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.499790 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.499156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/994019bc-fe5d-4c20-abc0-f589b27a59ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.500462 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.500135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-tmp\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.500462 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.500160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-etc-tuned\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.505520 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.505218 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:13.505520 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.505243 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:13.505520 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.505256 2574 
projected.go:194] Error preparing data for projected volume kube-api-access-n65ft for pod openshift-network-diagnostics/network-check-target-pq6xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:13.505520 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.505366 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft podName:174399de-7e6b-4315-ba27-7e933c5c30d9 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:14.005306963 +0000 UTC m=+3.053884084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-n65ft" (UniqueName: "kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft") pod "network-check-target-pq6xw" (UID: "174399de-7e6b-4315-ba27-7e933c5c30d9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:13.507804 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.507765 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6qdp\" (UniqueName: \"kubernetes.io/projected/eacd4fff-e409-4534-945c-507d909b8258-kube-api-access-b6qdp\") pod \"node-ca-mvrgd\" (UID: \"eacd4fff-e409-4534-945c-507d909b8258\") " pod="openshift-image-registry/node-ca-mvrgd" Apr 16 08:33:13.508318 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.508286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fjs5\" (UniqueName: \"kubernetes.io/projected/93104adf-798d-4000-97f1-77b9402f3a86-kube-api-access-5fjs5\") pod \"aws-ebs-csi-driver-node-qfzqw\" (UID: \"93104adf-798d-4000-97f1-77b9402f3a86\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" Apr 16 08:33:13.508318 ip-10-0-128-115 
kubenswrapper[2574]: I0416 08:33:13.508299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thcq4\" (UniqueName: \"kubernetes.io/projected/99c69840-b6dc-47aa-a435-9c9a49111d84-kube-api-access-thcq4\") pod \"node-resolver-wqhp5\" (UID: \"99c69840-b6dc-47aa-a435-9c9a49111d84\") " pod="openshift-dns/node-resolver-wqhp5" Apr 16 08:33:13.508318 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.508293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9flxv\" (UniqueName: \"kubernetes.io/projected/994019bc-fe5d-4c20-abc0-f589b27a59ca-kube-api-access-9flxv\") pod \"multus-additional-cni-plugins-87mwn\" (UID: \"994019bc-fe5d-4c20-abc0-f589b27a59ca\") " pod="openshift-multus/multus-additional-cni-plugins-87mwn" Apr 16 08:33:13.509557 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.509535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhp6d\" (UniqueName: \"kubernetes.io/projected/7f8ea69a-dea8-4b99-b06b-d678bbe4c26e-kube-api-access-vhp6d\") pod \"iptables-alerter-2xl8x\" (UID: \"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e\") " pod="openshift-network-operator/iptables-alerter-2xl8x" Apr 16 08:33:13.510698 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.510599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kw66\" (UniqueName: \"kubernetes.io/projected/ec8a52eb-3122-4a62-bca8-7bf7966c67e7-kube-api-access-2kw66\") pod \"tuned-6rqnd\" (UID: \"ec8a52eb-3122-4a62-bca8-7bf7966c67e7\") " pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" Apr 16 08:33:13.539843 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.539795 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" event={"ID":"b46c00968a1a056bfff21f51e35fabd4","Type":"ContainerStarted","Data":"252395c90c4cec14edc3422675bacfec05023fa4c5a1b027873ae072a395b3d0"} 
Apr 16 08:33:13.540671 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.540642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" event={"ID":"69c436ee0b974ec434e2858234467270","Type":"ContainerStarted","Data":"bf531cb748f9c4a3468526a661557ce14672457981dd73111bebe81f0f1928fb"} Apr 16 08:33:13.599529 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-cni-binary-copy\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599535 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-socket-dir-parent\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-cni-bin\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-env-overrides\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: 
I0416 08:33:13.599609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-cni-bin\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-k8s-cni-cncf-io\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-socket-dir-parent\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-conf-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.599676 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-multus-certs\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599690 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-k8s-cni-cncf-io\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-systemd\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-conf-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-log-socket\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-log-socket\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599764 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-multus-certs\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599728 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-systemd\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-cni-bin\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-cni-bin\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-kubelet\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-daemon-config\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-kubelet\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.599991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-etc-kubernetes\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600034 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-slash\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-run-netns\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-etc-kubernetes\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml" Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-var-lib-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600166 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-kubelet\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-var-lib-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-systemd-units\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-ovnkube-config\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-env-overrides\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d6cfd862-350d-4749-a71a-3363dc8bbfa0-agent-certs\") pod \"konnectivity-agent-rllmj\" (UID: \"d6cfd862-350d-4749-a71a-3363dc8bbfa0\") " pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-run-netns\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.600258 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-slash\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-system-cni-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:13.600308 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:14.100295599 +0000 UTC m=+3.148872710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-kubelet\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-cni-binary-copy\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-system-cni-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-daemon-config\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.600929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-systemd-units\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-cni-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-cni-multus\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-hostroot\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600576 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rx5t\" (UniqueName: \"kubernetes.io/projected/b1290b06-222c-45ae-985a-c88370488114-kube-api-access-5rx5t\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-cnibin\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-multus-cni-dir\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-netns\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89wdh\" (UniqueName: \"kubernetes.io/projected/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-kube-api-access-89wdh\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-ovn\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-hostroot\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-cni-netd\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-node-log\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-ovnkube-script-lib\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d6cfd862-350d-4749-a71a-3363dc8bbfa0-konnectivity-ca\") pod \"konnectivity-agent-rllmj\" (UID: \"d6cfd862-350d-4749-a71a-3363dc8bbfa0\") " pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-run-ovn-kubernetes\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.601766 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-os-release\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-etc-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-ovn\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-var-lib-cni-multus\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82x8g\" (UniqueName: \"kubernetes.io/projected/d6bc0f25-3003-4855-b122-6d1820717354-kube-api-access-82x8g\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6bc0f25-3003-4855-b122-6d1820717354-ovn-node-metrics-cert\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-node-log\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-host-run-netns\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-cnibin\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.600795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-run-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-host-cni-netd\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6bc0f25-3003-4855-b122-6d1820717354-etc-openvswitch\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-os-release\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-ovnkube-script-lib\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6bc0f25-3003-4855-b122-6d1820717354-ovnkube-config\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.602614 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.601695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d6cfd862-350d-4749-a71a-3363dc8bbfa0-konnectivity-ca\") pod \"konnectivity-agent-rllmj\" (UID: \"d6cfd862-350d-4749-a71a-3363dc8bbfa0\") " pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.603364 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.602828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d6cfd862-350d-4749-a71a-3363dc8bbfa0-agent-certs\") pod \"konnectivity-agent-rllmj\" (UID: \"d6cfd862-350d-4749-a71a-3363dc8bbfa0\") " pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.603585 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.603567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6bc0f25-3003-4855-b122-6d1820717354-ovn-node-metrics-cert\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.609589 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.609568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89wdh\" (UniqueName: \"kubernetes.io/projected/e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb-kube-api-access-89wdh\") pod \"multus-b8fml\" (UID: \"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb\") " pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.610062 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.610044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rx5t\" (UniqueName: \"kubernetes.io/projected/b1290b06-222c-45ae-985a-c88370488114-kube-api-access-5rx5t\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:13.610905 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.610875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82x8g\" (UniqueName: \"kubernetes.io/projected/d6bc0f25-3003-4855-b122-6d1820717354-kube-api-access-82x8g\") pod \"ovnkube-node-f6wjp\" (UID: \"d6bc0f25-3003-4855-b122-6d1820717354\") " pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.689338 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.689259 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wqhp5"
Apr 16 08:33:13.699615 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.699594 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw"
Apr 16 08:33:13.709319 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.709299 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6rqnd"
Apr 16 08:33:13.716804 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.716789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mvrgd"
Apr 16 08:33:13.725317 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.725298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2xl8x"
Apr 16 08:33:13.731860 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.731846 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87mwn"
Apr 16 08:33:13.744667 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.744589 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b8fml"
Apr 16 08:33:13.752140 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.752119 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:13.757797 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.757781 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:13.884988 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:13.884958 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 08:33:14.005542 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.005464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:14.005691 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:14.005608 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:14.005691 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:14.005627 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:14.005691 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:14.005638 2574 projected.go:194] Error preparing data for projected volume kube-api-access-n65ft for pod openshift-network-diagnostics/network-check-target-pq6xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:14.005817 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:14.005696 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft podName:174399de-7e6b-4315-ba27-7e933c5c30d9 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:15.005677817 +0000 UTC m=+4.054254931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n65ft" (UniqueName: "kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft") pod "network-check-target-pq6xw" (UID: "174399de-7e6b-4315-ba27-7e933c5c30d9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:14.065424 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.065385 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93104adf_798d_4000_97f1_77b9402f3a86.slice/crio-92547d83c241b05c142989cf46a98fdfe0965995796df5167774a0814ead8aac WatchSource:0}: Error finding container 92547d83c241b05c142989cf46a98fdfe0965995796df5167774a0814ead8aac: Status 404 returned error can't find the container with id 92547d83c241b05c142989cf46a98fdfe0965995796df5167774a0814ead8aac
Apr 16 08:33:14.066414 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.066387 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c69840_b6dc_47aa_a435_9c9a49111d84.slice/crio-fb0a81d25b524a867da9fe204b80f8a360f4959ffab2d4d75e45cbea3890225d WatchSource:0}: Error finding container fb0a81d25b524a867da9fe204b80f8a360f4959ffab2d4d75e45cbea3890225d: Status 404 returned error can't find the container with id fb0a81d25b524a867da9fe204b80f8a360f4959ffab2d4d75e45cbea3890225d
Apr 16 08:33:14.071049 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.070994 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994019bc_fe5d_4c20_abc0_f589b27a59ca.slice/crio-d90ef02c312fdbd0778675912f265fd68efa0543b37d059905dd32735b545cb0 WatchSource:0}: Error finding container d90ef02c312fdbd0778675912f265fd68efa0543b37d059905dd32735b545cb0: Status 404 returned error can't find the container with id d90ef02c312fdbd0778675912f265fd68efa0543b37d059905dd32735b545cb0
Apr 16 08:33:14.072976 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.072113 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec8a52eb_3122_4a62_bca8_7bf7966c67e7.slice/crio-8f6aa7bc5be9d443d01447a97b791b532579ca263f8249f238d18b4ec9a67578 WatchSource:0}: Error finding container 8f6aa7bc5be9d443d01447a97b791b532579ca263f8249f238d18b4ec9a67578: Status 404 returned error can't find the container with id 8f6aa7bc5be9d443d01447a97b791b532579ca263f8249f238d18b4ec9a67578
Apr 16 08:33:14.075177 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.075152 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeacd4fff_e409_4534_945c_507d909b8258.slice/crio-28f18163a009a86719666e37f95de8c884d8b60d84bcc8c05e3f28b7ca29de1e WatchSource:0}: Error finding container 28f18163a009a86719666e37f95de8c884d8b60d84bcc8c05e3f28b7ca29de1e: Status 404 returned error can't find the container with id 28f18163a009a86719666e37f95de8c884d8b60d84bcc8c05e3f28b7ca29de1e
Apr 16 08:33:14.075427 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.075405 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f1bf71_c497_4ba1_8e98_12b8dcdc7dcb.slice/crio-e8c232863932b6d4e4e600a2e563ea98a80b5d1a62c87ff8dff4256cc618dc86 WatchSource:0}: Error finding container e8c232863932b6d4e4e600a2e563ea98a80b5d1a62c87ff8dff4256cc618dc86: Status 404 returned error can't find the container with id e8c232863932b6d4e4e600a2e563ea98a80b5d1a62c87ff8dff4256cc618dc86
Apr 16 08:33:14.077469 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.076964 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bc0f25_3003_4855_b122_6d1820717354.slice/crio-76c4b166fc9f28628c129b8ac0c5529de1678b2194b6ff4b2579e51787bba937 WatchSource:0}: Error finding container 76c4b166fc9f28628c129b8ac0c5529de1678b2194b6ff4b2579e51787bba937: Status 404 returned error can't find the container with id 76c4b166fc9f28628c129b8ac0c5529de1678b2194b6ff4b2579e51787bba937
Apr 16 08:33:14.077654 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.077639 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f8ea69a_dea8_4b99_b06b_d678bbe4c26e.slice/crio-ca8397e29fd40a6d92e357a39c1d917df02c14d9c860761a31d4de23639b85c7 WatchSource:0}: Error finding container ca8397e29fd40a6d92e357a39c1d917df02c14d9c860761a31d4de23639b85c7: Status 404 returned error can't find the container with id ca8397e29fd40a6d92e357a39c1d917df02c14d9c860761a31d4de23639b85c7
Apr 16 08:33:14.078246 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:14.078216 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6cfd862_350d_4749_a71a_3363dc8bbfa0.slice/crio-640b54b09e4ab87e2be0c8db79dcd46c464f1e439f56963928cd0ac8bb949741 WatchSource:0}: Error finding container 640b54b09e4ab87e2be0c8db79dcd46c464f1e439f56963928cd0ac8bb949741: Status 404 returned error can't find the container with id 640b54b09e4ab87e2be0c8db79dcd46c464f1e439f56963928cd0ac8bb949741
Apr 16 08:33:14.106553 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.106533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:14.106659 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:14.106647 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:14.106708 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:14.106699 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:15.10668661 +0000 UTC m=+4.155263717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:14.428560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.428292 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:28:12 +0000 UTC" deadline="2027-10-03 13:15:15.698863114 +0000 UTC"
Apr 16 08:33:14.428560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.428496 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12844h42m1.270372745s"
Apr 16 08:33:14.544586 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.544550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" event={"ID":"69c436ee0b974ec434e2858234467270","Type":"ContainerStarted","Data":"d4a429d6546abd85420f8d7c1528a673b287a157eacc12947aa1fdf804ab4eff"}
Apr 16 08:33:14.549272 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.549239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rllmj" event={"ID":"d6cfd862-350d-4749-a71a-3363dc8bbfa0","Type":"ContainerStarted","Data":"640b54b09e4ab87e2be0c8db79dcd46c464f1e439f56963928cd0ac8bb949741"}
Apr 16 08:33:14.551198 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.551153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2xl8x" event={"ID":"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e","Type":"ContainerStarted","Data":"ca8397e29fd40a6d92e357a39c1d917df02c14d9c860761a31d4de23639b85c7"}
Apr 16 08:33:14.553688 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.553641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"76c4b166fc9f28628c129b8ac0c5529de1678b2194b6ff4b2579e51787bba937"}
Apr 16 08:33:14.555112 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.555066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b8fml" event={"ID":"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb","Type":"ContainerStarted","Data":"e8c232863932b6d4e4e600a2e563ea98a80b5d1a62c87ff8dff4256cc618dc86"}
Apr 16 08:33:14.556399 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.556378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wqhp5" event={"ID":"99c69840-b6dc-47aa-a435-9c9a49111d84","Type":"ContainerStarted","Data":"fb0a81d25b524a867da9fe204b80f8a360f4959ffab2d4d75e45cbea3890225d"}
Apr 16 08:33:14.558382 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.558347 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" event={"ID":"93104adf-798d-4000-97f1-77b9402f3a86","Type":"ContainerStarted","Data":"92547d83c241b05c142989cf46a98fdfe0965995796df5167774a0814ead8aac"}
Apr 16 08:33:14.560510 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.560467 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-115.ec2.internal" podStartSLOduration=2.56045261 podStartE2EDuration="2.56045261s" podCreationTimestamp="2026-04-16 08:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:33:14.560285082 +0000 UTC m=+3.608862211" watchObservedRunningTime="2026-04-16 08:33:14.56045261 +0000 UTC m=+3.609029742"
Apr 16 08:33:14.562983 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.562962 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mvrgd" event={"ID":"eacd4fff-e409-4534-945c-507d909b8258","Type":"ContainerStarted","Data":"28f18163a009a86719666e37f95de8c884d8b60d84bcc8c05e3f28b7ca29de1e"}
Apr 16 08:33:14.567234 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.567211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" event={"ID":"ec8a52eb-3122-4a62-bca8-7bf7966c67e7","Type":"ContainerStarted","Data":"8f6aa7bc5be9d443d01447a97b791b532579ca263f8249f238d18b4ec9a67578"}
Apr 16 08:33:14.569142 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:14.569120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerStarted","Data":"d90ef02c312fdbd0778675912f265fd68efa0543b37d059905dd32735b545cb0"}
Apr 16 08:33:15.016608 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:15.015898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:15.016608 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.016102 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:15.016608 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.016124 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:15.016608 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.016137 2574 projected.go:194] Error preparing data for projected volume kube-api-access-n65ft for pod openshift-network-diagnostics/network-check-target-pq6xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:15.016608 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.016192 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft podName:174399de-7e6b-4315-ba27-7e933c5c30d9 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:17.016173901 +0000 UTC m=+6.064751012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-n65ft" (UniqueName: "kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft") pod "network-check-target-pq6xw" (UID: "174399de-7e6b-4315-ba27-7e933c5c30d9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:15.117148 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:15.116620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:15.117148 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.116754 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:15.117148 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.116814 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:17.116796742 +0000 UTC m=+6.165373856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:15.537099 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:15.535289 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:15.537099 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.535418 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:15.537099 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:15.535822 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:15.537099 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:15.535922 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:15.584442 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:15.583387 2574 generic.go:358] "Generic (PLEG): container finished" podID="b46c00968a1a056bfff21f51e35fabd4" containerID="51506a2b5c3af049ad47e131e2dd6be5088f3a515df7cfb3c54ffd273a024efd" exitCode=0 Apr 16 08:33:15.584442 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:15.584259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" event={"ID":"b46c00968a1a056bfff21f51e35fabd4","Type":"ContainerDied","Data":"51506a2b5c3af049ad47e131e2dd6be5088f3a515df7cfb3c54ffd273a024efd"} Apr 16 08:33:16.603269 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:16.602591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" event={"ID":"b46c00968a1a056bfff21f51e35fabd4","Type":"ContainerStarted","Data":"80dcc19ba33b4ca9e7c92eae4fa241f9350ca6630aedfec303c889dbcb89f394"} Apr 16 08:33:17.033441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:17.033360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:17.033602 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.033511 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:17.033602 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.033529 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:17.033602 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.033541 2574 projected.go:194] Error preparing data for projected volume kube-api-access-n65ft for pod openshift-network-diagnostics/network-check-target-pq6xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:17.033602 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.033596 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft podName:174399de-7e6b-4315-ba27-7e933c5c30d9 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:21.033576361 +0000 UTC m=+10.082153469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n65ft" (UniqueName: "kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft") pod "network-check-target-pq6xw" (UID: "174399de-7e6b-4315-ba27-7e933c5c30d9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:17.134089 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:17.134056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:17.134264 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.134218 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:17.134326 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.134273 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:21.134255702 +0000 UTC m=+10.182832822 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:17.535260 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:17.534781 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:17.535260 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.534910 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:17.535260 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:17.535090 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:17.535260 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:17.535175 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:19.534545 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:19.534439 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:19.534972 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:19.534571 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:19.534972 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:19.534733 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:19.534972 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:19.534856 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:21.067430 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:21.067380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:21.067826 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.067553 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:21.067826 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.067578 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:21.067826 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.067591 2574 projected.go:194] Error preparing data for projected volume kube-api-access-n65ft for pod openshift-network-diagnostics/network-check-target-pq6xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:21.067826 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.067649 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft podName:174399de-7e6b-4315-ba27-7e933c5c30d9 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:29.067631395 +0000 UTC m=+18.116208517 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n65ft" (UniqueName: "kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft") pod "network-check-target-pq6xw" (UID: "174399de-7e6b-4315-ba27-7e933c5c30d9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:21.167994 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:21.167960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:21.168187 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.168114 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:21.168187 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.168180 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:29.168163039 +0000 UTC m=+18.216740153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:21.536134 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:21.535747 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:21.536134 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.535860 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:21.536134 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:21.535893 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:21.536134 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:21.535990 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:23.535071 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:23.534981 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:23.535492 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:23.535112 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:23.535492 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:23.535152 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:23.535492 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:23.535241 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:25.537555 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:25.537527 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:25.537992 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:25.537532 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:25.537992 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:25.537651 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:25.537992 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:25.537726 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:27.537107 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:27.537076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:27.537610 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:27.537078 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:27.537610 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:27.537186 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:27.537610 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:27.537252 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:29.126681 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:29.126643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:29.127147 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.126832 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:29.127147 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.126855 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:29.127147 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.126868 2574 projected.go:194] Error preparing data for projected volume kube-api-access-n65ft for pod openshift-network-diagnostics/network-check-target-pq6xw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:29.127147 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.126932 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft podName:174399de-7e6b-4315-ba27-7e933c5c30d9 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:45.126912218 +0000 UTC m=+34.175489333 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n65ft" (UniqueName: "kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft") pod "network-check-target-pq6xw" (UID: "174399de-7e6b-4315-ba27-7e933c5c30d9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:29.227544 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:29.227506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:29.227739 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.227663 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:29.227807 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.227740 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:45.227720025 +0000 UTC m=+34.276297149 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:29.535292 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:29.535215 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:29.535445 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.535341 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:29.535445 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:29.535398 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:29.535564 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:29.535459 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:31.535490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.535345 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:31.535917 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:31.535543 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9" Apr 16 08:33:31.535917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.535412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:31.535917 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:31.535630 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:33:31.628112 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.628082 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rllmj" event={"ID":"d6cfd862-350d-4749-a71a-3363dc8bbfa0","Type":"ContainerStarted","Data":"da9487f2e328f650764a49135c728338bc87824714c24bb0e70eadc184eca016"} Apr 16 08:33:31.630072 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.630053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:33:31.630372 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.630351 2574 generic.go:358] "Generic (PLEG): container finished" podID="d6bc0f25-3003-4855-b122-6d1820717354" containerID="87f5fcafc8d6e7c1bde1f76518fedf26f9f67fc2baaa766dc02a2494cea6dfd4" exitCode=1 Apr 16 08:33:31.630460 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.630424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"1c8916ce07130b3e7c4c6904c5b68a9c03dfb1327519d28ea778278dc1bc7ea2"} Apr 
16 08:33:31.630460 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.630456 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"202d9aeb24e43303ada14b5dc776274530e2361647f08bb455f3882db5df2f1e"}
Apr 16 08:33:31.630523 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.630467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerDied","Data":"87f5fcafc8d6e7c1bde1f76518fedf26f9f67fc2baaa766dc02a2494cea6dfd4"}
Apr 16 08:33:31.630523 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.630482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"d39c347498ce02ca81f4b1cdd77bd8f04d30a4b157b653a8c580a531106fa0b0"}
Apr 16 08:33:31.631896 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.631833 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b8fml" event={"ID":"e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb","Type":"ContainerStarted","Data":"a09f3018d19e95ebf976859d9cc37a49b1b89ef01d441db7e9eb233c84505e3e"}
Apr 16 08:33:31.633146 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.633122 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wqhp5" event={"ID":"99c69840-b6dc-47aa-a435-9c9a49111d84","Type":"ContainerStarted","Data":"18b765dca5be4cbd93f019b18d9c59b4cd241aefc3858430dd68a54203029f0a"}
Apr 16 08:33:31.634503 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.634476 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" event={"ID":"93104adf-798d-4000-97f1-77b9402f3a86","Type":"ContainerStarted","Data":"b3dbab13edb37621c8faf505826743bbe3d3e3989a5bc856a49bf9ad05709870"}
Apr 16 08:33:31.635781 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.635759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mvrgd" event={"ID":"eacd4fff-e409-4534-945c-507d909b8258","Type":"ContainerStarted","Data":"9b397aed78d8d9e3c51819a1bd2a17bfeb8f13c9cc9f0039a9d85232088920da"}
Apr 16 08:33:31.637133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.637099 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" event={"ID":"ec8a52eb-3122-4a62-bca8-7bf7966c67e7","Type":"ContainerStarted","Data":"260c3fcd732944c8aa12bf590c7b4cb5571bbcacc3524768e6fc796a54640d9a"}
Apr 16 08:33:31.638453 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.638432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerStarted","Data":"ea9110a72e7289f3b477d70f48acd59a571f22fb1bd3ec934d70cdbdba94f813"}
Apr 16 08:33:31.644310 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.644269 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rllmj" podStartSLOduration=11.60002674 podStartE2EDuration="20.644258397s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.080262428 +0000 UTC m=+3.128839539" lastFinishedPulling="2026-04-16 08:33:23.124494072 +0000 UTC m=+12.173071196" observedRunningTime="2026-04-16 08:33:31.644057057 +0000 UTC m=+20.692634186" watchObservedRunningTime="2026-04-16 08:33:31.644258397 +0000 UTC m=+20.692835526"
Apr 16 08:33:31.644587 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.644559 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-115.ec2.internal" podStartSLOduration=19.644554867 podStartE2EDuration="19.644554867s" podCreationTimestamp="2026-04-16 08:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:33:16.621824142 +0000 UTC m=+5.670401273" watchObservedRunningTime="2026-04-16 08:33:31.644554867 +0000 UTC m=+20.693131998"
Apr 16 08:33:31.661993 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.661951 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b8fml" podStartSLOduration=3.6763333620000003 podStartE2EDuration="20.661938935s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.077472173 +0000 UTC m=+3.126049280" lastFinishedPulling="2026-04-16 08:33:31.063077745 +0000 UTC m=+20.111654853" observedRunningTime="2026-04-16 08:33:31.661740797 +0000 UTC m=+20.710317927" watchObservedRunningTime="2026-04-16 08:33:31.661938935 +0000 UTC m=+20.710516063"
Apr 16 08:33:31.676531 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.676495 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mvrgd" podStartSLOduration=3.699684959 podStartE2EDuration="20.676482773s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.076403218 +0000 UTC m=+3.124980328" lastFinishedPulling="2026-04-16 08:33:31.053201033 +0000 UTC m=+20.101778142" observedRunningTime="2026-04-16 08:33:31.67631463 +0000 UTC m=+20.724891753" watchObservedRunningTime="2026-04-16 08:33:31.676482773 +0000 UTC m=+20.725059902"
Apr 16 08:33:31.690454 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.690416 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wqhp5" podStartSLOduration=3.705909039 podStartE2EDuration="20.690403094s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.068745785 +0000 UTC m=+3.117322892" lastFinishedPulling="2026-04-16 08:33:31.053239832 +0000 UTC m=+20.101816947" observedRunningTime="2026-04-16 08:33:31.690252792 +0000 UTC m=+20.738829936" watchObservedRunningTime="2026-04-16 08:33:31.690403094 +0000 UTC m=+20.738980223"
Apr 16 08:33:31.727465 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:31.727430 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6rqnd" podStartSLOduration=3.723888349 podStartE2EDuration="20.72741732s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.07591903 +0000 UTC m=+3.124496199" lastFinishedPulling="2026-04-16 08:33:31.079448063 +0000 UTC m=+20.128025170" observedRunningTime="2026-04-16 08:33:31.705602215 +0000 UTC m=+20.754179346" watchObservedRunningTime="2026-04-16 08:33:31.72741732 +0000 UTC m=+20.775994448"
Apr 16 08:33:32.419815 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.419791 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 08:33:32.449650 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.449582 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T08:33:32.419811009Z","UUID":"a3ac238c-4673-4ae8-9126-3ab22f75bc08","Handler":null,"Name":"","Endpoint":""}
Apr 16 08:33:32.451039 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.451008 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 08:33:32.451126 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.451059 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 08:33:32.641441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.641371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" event={"ID":"93104adf-798d-4000-97f1-77b9402f3a86","Type":"ContainerStarted","Data":"0e4964549b9d780dc6b37e386e178e2b28bf998d0a7ea238c20c9e31d2b34864"}
Apr 16 08:33:32.642589 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.642569 2574 generic.go:358] "Generic (PLEG): container finished" podID="994019bc-fe5d-4c20-abc0-f589b27a59ca" containerID="ea9110a72e7289f3b477d70f48acd59a571f22fb1bd3ec934d70cdbdba94f813" exitCode=0
Apr 16 08:33:32.642670 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.642625 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerDied","Data":"ea9110a72e7289f3b477d70f48acd59a571f22fb1bd3ec934d70cdbdba94f813"}
Apr 16 08:33:32.643958 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.643935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2xl8x" event={"ID":"7f8ea69a-dea8-4b99-b06b-d678bbe4c26e","Type":"ContainerStarted","Data":"274d9004dcaa1656b46fcda2d8d7b872873f3c4d531f94611b47f58bed0adf2e"}
Apr 16 08:33:32.646387 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.646372 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:33:32.646836 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.646798 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"a9223eeafa54af04269033c108c355a357f9cf1c087b89715e805b6f4dea0d92"}
Apr 16 08:33:32.646919 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.646844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"a3f26ce8fb0f94b6e4c95b130bf35fd823d49acf8e6e649b4f82b359da1f5174"}
Apr 16 08:33:32.679805 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:32.679770 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2xl8x" podStartSLOduration=4.732317373 podStartE2EDuration="21.679759365s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.080512311 +0000 UTC m=+3.129089419" lastFinishedPulling="2026-04-16 08:33:31.027954286 +0000 UTC m=+20.076531411" observedRunningTime="2026-04-16 08:33:32.679610403 +0000 UTC m=+21.728187529" watchObservedRunningTime="2026-04-16 08:33:32.679759365 +0000 UTC m=+21.728336494"
Apr 16 08:33:33.534663 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:33.534635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:33.534844 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:33.534630 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:33.534844 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:33.534743 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114"
Apr 16 08:33:33.534844 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:33.534788 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9"
Apr 16 08:33:34.652815 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:34.652778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" event={"ID":"93104adf-798d-4000-97f1-77b9402f3a86","Type":"ContainerStarted","Data":"7aee3f57369b129d19d5249aed17a97d1eb6d96be3c801ed9986121af2ca5b5b"}
Apr 16 08:33:34.658805 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:34.658594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:33:34.659146 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:34.659119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"3453bb2f88db065268afeab4a1e29f9dbd28fa9c876d9f2e2da7aa2baeabef18"}
Apr 16 08:33:34.673232 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:34.673198 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qfzqw" podStartSLOduration=4.094808962 podStartE2EDuration="23.673187811s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.067982545 +0000 UTC m=+3.116559666" lastFinishedPulling="2026-04-16 08:33:33.646361395 +0000 UTC m=+22.694938515" observedRunningTime="2026-04-16 08:33:34.672927926 +0000 UTC m=+23.721505055" watchObservedRunningTime="2026-04-16 08:33:34.673187811 +0000 UTC m=+23.721764980"
Apr 16 08:33:35.326314 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:35.326280 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:35.326935 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:35.326908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:35.534448 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:35.534415 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:35.534603 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:35.534424 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:35.534603 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:35.534515 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9"
Apr 16 08:33:35.534682 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:35.534623 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114"
Apr 16 08:33:37.270413 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.270238 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:37.271044 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.270500 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 08:33:37.271044 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.270813 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rllmj"
Apr 16 08:33:37.534545 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.534472 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:37.534651 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:37.534560 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9"
Apr 16 08:33:37.534651 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.534638 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:37.534743 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:37.534726 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114"
Apr 16 08:33:37.666789 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.666766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:33:37.667086 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.667066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"148c90745c06aa17d3d474b93b0141e8375d40932dc8b3926acb0ea4ccb10509"}
Apr 16 08:33:37.667439 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.667405 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:37.667439 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.667447 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:37.667590 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.667580 2574 scope.go:117] "RemoveContainer" containerID="87f5fcafc8d6e7c1bde1f76518fedf26f9f67fc2baaa766dc02a2494cea6dfd4"
Apr 16 08:33:37.668950 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.668926 2574 generic.go:358] "Generic (PLEG): container finished" podID="994019bc-fe5d-4c20-abc0-f589b27a59ca" containerID="a04a9b41ba7050288321e32d9ee1c9aa73c4ddf040b491c5dec032a46c7966b9" exitCode=0
Apr 16 08:33:37.669067 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.668998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerDied","Data":"a04a9b41ba7050288321e32d9ee1c9aa73c4ddf040b491c5dec032a46c7966b9"}
Apr 16 08:33:37.684006 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.683986 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:37.684100 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:37.684061 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:38.672632 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:38.672554 2574 generic.go:358] "Generic (PLEG): container finished" podID="994019bc-fe5d-4c20-abc0-f589b27a59ca" containerID="19d214e184dff4ca77a9da94fe5d7026823272befd3e403931b56693b95d3562" exitCode=0
Apr 16 08:33:38.673010 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:38.672627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerDied","Data":"19d214e184dff4ca77a9da94fe5d7026823272befd3e403931b56693b95d3562"}
Apr 16 08:33:38.676146 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:38.676125 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:33:38.676522 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:38.676501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" event={"ID":"d6bc0f25-3003-4855-b122-6d1820717354","Type":"ContainerStarted","Data":"9ec680ca8486e561dfbff3e878ad60eea416b227044ca3bf3a7aaa204667fcff"}
Apr 16 08:33:38.676759 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:38.676741 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp"
Apr 16 08:33:38.729557 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:38.729514 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" podStartSLOduration=10.704987477 podStartE2EDuration="27.729500199s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.078434425 +0000 UTC m=+3.127011532" lastFinishedPulling="2026-04-16 08:33:31.102947142 +0000 UTC m=+20.151524254" observedRunningTime="2026-04-16 08:33:38.729244346 +0000 UTC m=+27.777821500" watchObservedRunningTime="2026-04-16 08:33:38.729500199 +0000 UTC m=+27.778077328"
Apr 16 08:33:39.061820 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:39.061450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pq6xw"]
Apr 16 08:33:39.061820 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:39.061586 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:39.061820 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:39.061685 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9"
Apr 16 08:33:39.065331 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:39.063052 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r9fn4"]
Apr 16 08:33:39.065331 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:39.063196 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:39.065331 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:39.063337 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114"
Apr 16 08:33:39.680644 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:39.680574 2574 generic.go:358] "Generic (PLEG): container finished" podID="994019bc-fe5d-4c20-abc0-f589b27a59ca" containerID="821e712e7eb8de5b377b45073deb1710fa9a7c6dd75dedff013beba0cdd96939" exitCode=0
Apr 16 08:33:39.681062 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:39.680668 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerDied","Data":"821e712e7eb8de5b377b45073deb1710fa9a7c6dd75dedff013beba0cdd96939"}
Apr 16 08:33:40.535297 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:40.535215 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:40.535456 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:40.535224 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:40.535456 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:40.535352 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114"
Apr 16 08:33:40.535456 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:40.535408 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9"
Apr 16 08:33:42.535701 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:42.535464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:42.536134 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:42.535527 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:42.536134 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:42.535826 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114"
Apr 16 08:33:42.536134 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:42.535874 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pq6xw" podUID="174399de-7e6b-4315-ba27-7e933c5c30d9"
Apr 16 08:33:44.241484 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.241408 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeReady"
Apr 16 08:33:44.241926 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.241555 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 08:33:44.285898 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.285869 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zc2r7"]
Apr 16 08:33:44.290739 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.290709 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.293782 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.293758 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 08:33:44.293941 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.293827 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l2kgt\""
Apr 16 08:33:44.293941 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.293765 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 08:33:44.294187 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.294152 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rnmq6"]
Apr 16 08:33:44.297471 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.297452 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rnmq6"
Apr 16 08:33:44.298976 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.298662 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zc2r7"]
Apr 16 08:33:44.299608 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.299584 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 08:33:44.299835 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.299817 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 08:33:44.300139 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.300124 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 08:33:44.300238 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.300124 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kp7vt\""
Apr 16 08:33:44.304444 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.304423 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rnmq6"]
Apr 16 08:33:44.444124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.444095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngxnf\" (UniqueName: \"kubernetes.io/projected/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-kube-api-access-ngxnf\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.444124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.444129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6l9\" (UniqueName: \"kubernetes.io/projected/6ac45079-5104-4b55-acd6-dd06367716a0-kube-api-access-tx6l9\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6"
Apr 16 08:33:44.444356 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.444147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.444356 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.444236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-tmp-dir\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.444356 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.444298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-config-volume\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.444479 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.444382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6"
Apr 16 08:33:44.535010 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.534943 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:33:44.535165 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.534943 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw"
Apr 16 08:33:44.538329 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.538307 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 08:33:44.538453 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.538375 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 08:33:44.538453 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.538396 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 08:33:44.538453 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.538313 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kd4nk\""
Apr 16 08:33:44.538664 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.538314 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-blgdz\""
Apr 16 08:33:44.545339 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.545315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngxnf\" (UniqueName: \"kubernetes.io/projected/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-kube-api-access-ngxnf\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.545441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.545358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6l9\" (UniqueName: \"kubernetes.io/projected/6ac45079-5104-4b55-acd6-dd06367716a0-kube-api-access-tx6l9\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6"
Apr 16 08:33:44.545441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.545385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.545441 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.545434 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-tmp-dir\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.545613 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.545463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-config-volume\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.545613 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.545510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6"
Apr 16 08:33:44.545613 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:44.545514 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 08:33:44.545613 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:44.545582 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:33:45.045550115 +0000 UTC m=+34.094127225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found
Apr 16 08:33:44.545965 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:44.545934 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 08:33:44.546124 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:44.546111 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:45.046093725 +0000 UTC m=+34.094670842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found
Apr 16 08:33:44.546434 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.546335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-tmp-dir\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.546895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.546694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-config-volume\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.559037 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.559000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngxnf\" (UniqueName: \"kubernetes.io/projected/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-kube-api-access-ngxnf\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:33:44.559190 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:44.559171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6l9\" (UniqueName: \"kubernetes.io/projected/6ac45079-5104-4b55-acd6-dd06367716a0-kube-api-access-tx6l9\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6"
Apr 16 08:33:45.048787 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.048751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:33:45.048973 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.048823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:33:45.048973 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:45.048918 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:45.048973 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:45.048925 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:45.048973 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:45.048973 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:46.048958084 +0000 UTC m=+35.097535191 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:33:45.049154 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:45.048986 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:33:46.0489805 +0000 UTC m=+35.097557606 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:33:45.149749 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.149722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:45.152262 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.152245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n65ft\" (UniqueName: \"kubernetes.io/projected/174399de-7e6b-4315-ba27-7e933c5c30d9-kube-api-access-n65ft\") pod \"network-check-target-pq6xw\" (UID: \"174399de-7e6b-4315-ba27-7e933c5c30d9\") " pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:45.152332 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.152316 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:45.250529 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.250502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:33:45.251075 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:45.250611 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:33:45.251075 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:45.250663 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:17.250649854 +0000 UTC m=+66.299226960 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : secret "metrics-daemon-secret" not found Apr 16 08:33:45.325986 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.325800 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pq6xw"] Apr 16 08:33:45.438063 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:33:45.438005 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod174399de_7e6b_4315_ba27_7e933c5c30d9.slice/crio-88c5dd699df712adc716221d44c80befafc9bdcf36ecd2baf604b2a12896b724 WatchSource:0}: Error finding container 88c5dd699df712adc716221d44c80befafc9bdcf36ecd2baf604b2a12896b724: Status 404 returned error can't find the container with id 88c5dd699df712adc716221d44c80befafc9bdcf36ecd2baf604b2a12896b724 Apr 16 08:33:45.694479 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.694448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerStarted","Data":"c23ed105816b523c4976c64786f8c6e1b67c57040c2e584e559ef33b1cecb30a"} Apr 16 08:33:45.695538 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:45.695510 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pq6xw" event={"ID":"174399de-7e6b-4315-ba27-7e933c5c30d9","Type":"ContainerStarted","Data":"88c5dd699df712adc716221d44c80befafc9bdcf36ecd2baf604b2a12896b724"} Apr 16 08:33:46.055854 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:46.055768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod 
\"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:33:46.055854 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:46.055816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:33:46.056073 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:46.055939 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:46.056073 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:46.056001 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:33:48.055982998 +0000 UTC m=+37.104560105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:33:46.056073 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:46.055939 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:46.056260 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:46.056097 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:48.056078617 +0000 UTC m=+37.104655729 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:33:46.699911 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:46.699876 2574 generic.go:358] "Generic (PLEG): container finished" podID="994019bc-fe5d-4c20-abc0-f589b27a59ca" containerID="c23ed105816b523c4976c64786f8c6e1b67c57040c2e584e559ef33b1cecb30a" exitCode=0 Apr 16 08:33:46.700297 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:46.699944 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerDied","Data":"c23ed105816b523c4976c64786f8c6e1b67c57040c2e584e559ef33b1cecb30a"} Apr 16 08:33:47.705132 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:47.705102 2574 generic.go:358] "Generic (PLEG): container finished" podID="994019bc-fe5d-4c20-abc0-f589b27a59ca" containerID="6b47ef3d7ee650b955509ddd8b245a542c0bbd15dfdef31953ce7d80b1d341fe" exitCode=0 Apr 16 08:33:47.705132 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:47.705138 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerDied","Data":"6b47ef3d7ee650b955509ddd8b245a542c0bbd15dfdef31953ce7d80b1d341fe"} Apr 16 08:33:48.070577 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.070487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:33:48.070577 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.070555 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:33:48.070796 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:48.070648 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:48.070796 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:48.070675 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:48.070796 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:48.070718 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:52.070697987 +0000 UTC m=+41.119275099 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:33:48.070796 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:48.070733 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:33:52.070727557 +0000 UTC m=+41.119304664 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:33:48.710631 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.710447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87mwn" event={"ID":"994019bc-fe5d-4c20-abc0-f589b27a59ca","Type":"ContainerStarted","Data":"e552b8b17a84a281d071048c4484b88dfb09a9fbede817d822236b2039b6481d"} Apr 16 08:33:48.711692 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.711628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pq6xw" event={"ID":"174399de-7e6b-4315-ba27-7e933c5c30d9","Type":"ContainerStarted","Data":"762d9e4f48fac40191a996949d560263903d2b5b3e4b6c46e1a61823c4818eaa"} Apr 16 08:33:48.711812 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.711741 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:33:48.736945 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.736629 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-87mwn" podStartSLOduration=6.336144535 podStartE2EDuration="37.736614464s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:14.074859635 +0000 UTC m=+3.123436742" lastFinishedPulling="2026-04-16 08:33:45.475329565 +0000 UTC m=+34.523906671" observedRunningTime="2026-04-16 08:33:48.735259131 +0000 UTC m=+37.783836242" watchObservedRunningTime="2026-04-16 08:33:48.736614464 +0000 UTC m=+37.785191593" Apr 16 08:33:48.751968 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:48.751932 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-pq6xw" podStartSLOduration=34.742071908 podStartE2EDuration="37.751920107s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:45.453575505 +0000 UTC m=+34.502152613" lastFinishedPulling="2026-04-16 08:33:48.463423705 +0000 UTC m=+37.512000812" observedRunningTime="2026-04-16 08:33:48.751711501 +0000 UTC m=+37.800288630" watchObservedRunningTime="2026-04-16 08:33:48.751920107 +0000 UTC m=+37.800497235" Apr 16 08:33:52.095582 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:52.095544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:33:52.095582 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:33:52.095589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:33:52.095979 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:52.095701 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:52.095979 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:52.095742 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:52.095979 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:52.095778 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:34:00.095758084 +0000 UTC m=+49.144335194 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:33:52.095979 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:33:52.095793 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:34:00.095786356 +0000 UTC m=+49.144363466 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:34:00.146253 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:00.146213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:34:00.146253 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:00.146260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:34:00.146832 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:00.146349 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:34:00.146832 
ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:00.146351 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:34:00.146832 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:00.146406 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:34:16.146391936 +0000 UTC m=+65.194969043 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:34:00.146832 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:00.146419 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:16.146413536 +0000 UTC m=+65.194990643 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:34:09.694400 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:09.694373 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f6wjp" Apr 16 08:34:16.151396 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:16.151352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:34:16.151396 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:16.151403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:34:16.151795 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:16.151492 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:34:16.151795 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:16.151497 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:34:16.151795 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:16.151541 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. 
No retries permitted until 2026-04-16 08:34:48.151528943 +0000 UTC m=+97.200106050 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:34:16.151795 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:16.151554 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:48.151548382 +0000 UTC m=+97.200125490 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:34:17.257916 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:17.257868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:34:17.258330 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:17.258037 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:34:17.258330 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:17.258098 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:35:21.258083744 +0000 UTC m=+130.306660851 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : secret "metrics-daemon-secret" not found Apr 16 08:34:19.715395 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:19.715364 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pq6xw" Apr 16 08:34:48.251384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:48.251241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:34:48.251384 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:34:48.251300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:34:48.251384 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:48.251392 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:34:48.251900 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:48.251394 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:34:48.251900 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:48.251455 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls 
podName:5ee54e49-cfe0-4681-a0fd-a87ecc0d841c nodeName:}" failed. No retries permitted until 2026-04-16 08:35:52.25144023 +0000 UTC m=+161.300017337 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls") pod "dns-default-zc2r7" (UID: "5ee54e49-cfe0-4681-a0fd-a87ecc0d841c") : secret "dns-default-metrics-tls" not found Apr 16 08:34:48.251900 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:34:48.251470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert podName:6ac45079-5104-4b55-acd6-dd06367716a0 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:52.251463732 +0000 UTC m=+161.300040839 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert") pod "ingress-canary-rnmq6" (UID: "6ac45079-5104-4b55-acd6-dd06367716a0") : secret "canary-serving-cert" not found Apr 16 08:35:15.175190 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.175158 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-f5z6f"] Apr 16 08:35:15.180570 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.178689 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.181799 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.181772 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 08:35:15.181909 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.181830 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-845ml\"" Apr 16 08:35:15.182679 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.182658 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.182799 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.182658 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.182799 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.182664 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 08:35:15.186282 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.186266 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 08:35:15.190831 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.190809 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-f5z6f"] Apr 16 08:35:15.229115 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.229080 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76203c8-22cc-48c0-a9be-e10030af2601-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.229255 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.229121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d76203c8-22cc-48c0-a9be-e10030af2601-serving-cert\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.229255 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.229208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d76203c8-22cc-48c0-a9be-e10030af2601-tmp\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.229255 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.229236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76203c8-22cc-48c0-a9be-e10030af2601-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.229352 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.229263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d76203c8-22cc-48c0-a9be-e10030af2601-snapshots\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.229352 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.229279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nphls\" (UniqueName: \"kubernetes.io/projected/d76203c8-22cc-48c0-a9be-e10030af2601-kube-api-access-nphls\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.272046 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.272006 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"] Apr 16 08:35:15.274707 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.274692 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:15.276896 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.276878 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 08:35:15.277759 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.277741 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.277851 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.277743 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 08:35:15.277851 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.277801 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7dgfk\"" Apr 16 08:35:15.277851 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.277743 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.284175 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.284156 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"] Apr 16 08:35:15.329862 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.329839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkhn\" (UniqueName: \"kubernetes.io/projected/b6d16575-3414-445e-b597-457d144a72f3-kube-api-access-2qkhn\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:15.329967 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.329898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d76203c8-22cc-48c0-a9be-e10030af2601-tmp\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.329967 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.329916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76203c8-22cc-48c0-a9be-e10030af2601-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.329967 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.329942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d76203c8-22cc-48c0-a9be-e10030af2601-snapshots\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.329967 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.329957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nphls\" (UniqueName: \"kubernetes.io/projected/d76203c8-22cc-48c0-a9be-e10030af2601-kube-api-access-nphls\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.330172 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.329983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b6d16575-3414-445e-b597-457d144a72f3-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:15.330172 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.330011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:15.330172 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.330080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76203c8-22cc-48c0-a9be-e10030af2601-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.330172 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.330154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d76203c8-22cc-48c0-a9be-e10030af2601-serving-cert\") pod 
\"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.330874 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.330855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d76203c8-22cc-48c0-a9be-e10030af2601-tmp\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.331082 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.331065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76203c8-22cc-48c0-a9be-e10030af2601-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.331149 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.331084 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d76203c8-22cc-48c0-a9be-e10030af2601-snapshots\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.331248 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.331232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76203c8-22cc-48c0-a9be-e10030af2601-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.334349 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.334332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d76203c8-22cc-48c0-a9be-e10030af2601-serving-cert\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.338731 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.338709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nphls\" (UniqueName: \"kubernetes.io/projected/d76203c8-22cc-48c0-a9be-e10030af2601-kube-api-access-nphls\") pod \"insights-operator-5785d4fcdd-f5z6f\" (UID: \"d76203c8-22cc-48c0-a9be-e10030af2601\") " pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.368496 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.368476 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr"] Apr 16 08:35:15.371450 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.371432 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.372113 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.372093 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25"] Apr 16 08:35:15.373799 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.373780 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 08:35:15.373799 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.373788 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-r2mf6\"" Apr 16 08:35:15.373937 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.373780 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.373937 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.373816 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 08:35:15.374109 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.374092 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.374561 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.374545 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.374735 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.374704 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd"] Apr 16 08:35:15.376861 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.376841 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.376950 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.376911 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 08:35:15.376950 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.376923 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.376950 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.376935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 08:35:15.377132 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.376994 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-fwg4j\"" Apr 16 08:35:15.377328 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.377311 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" Apr 16 08:35:15.379403 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.379382 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.379484 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.379423 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.379537 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.379489 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-pj4wx\"" Apr 16 08:35:15.382754 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.382734 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr"] Apr 16 08:35:15.387901 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.387881 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25"] Apr 16 08:35:15.388869 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.388851 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd"] Apr 16 08:35:15.430742 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430676 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feaae171-21c4-4aed-973a-8bfcf22b6913-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.430742 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkhn\" (UniqueName: \"kubernetes.io/projected/b6d16575-3414-445e-b597-457d144a72f3-kube-api-access-2qkhn\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:15.430742 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggtgt\" (UniqueName: \"kubernetes.io/projected/feaae171-21c4-4aed-973a-8bfcf22b6913-kube-api-access-ggtgt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.430923 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feaae171-21c4-4aed-973a-8bfcf22b6913-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.430923 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-serving-cert\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: 
\"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.430923 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2846\" (UniqueName: \"kubernetes.io/projected/6e233b43-9590-4290-81b4-184a88df4ccf-kube-api-access-b2846\") pod \"volume-data-source-validator-7d955d5dd4-z8dmd\" (UID: \"6e233b43-9590-4290-81b4-184a88df4ccf\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" Apr 16 08:35:15.430923 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-config\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.430923 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430910 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lj5\" (UniqueName: \"kubernetes.io/projected/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-kube-api-access-l4lj5\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.431092 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b6d16575-3414-445e-b597-457d144a72f3-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 
16 08:35:15.431092 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.430966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"
Apr 16 08:35:15.431092 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:15.431071 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:15.431182 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:15.431129 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls podName:b6d16575-3414-445e-b597-457d144a72f3 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:15.931111832 +0000 UTC m=+124.979688940 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-6q9xt" (UID: "b6d16575-3414-445e-b597-457d144a72f3") : secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:15.431565 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.431548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b6d16575-3414-445e-b597-457d144a72f3-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"
Apr 16 08:35:15.444662 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.444640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkhn\" (UniqueName: \"kubernetes.io/projected/b6d16575-3414-445e-b597-457d144a72f3-kube-api-access-2qkhn\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"
Apr 16 08:35:15.489458 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.489438 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" Apr 16 08:35:15.531773 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.531746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-serving-cert\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.531917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.531785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2846\" (UniqueName: \"kubernetes.io/projected/6e233b43-9590-4290-81b4-184a88df4ccf-kube-api-access-b2846\") pod \"volume-data-source-validator-7d955d5dd4-z8dmd\" (UID: \"6e233b43-9590-4290-81b4-184a88df4ccf\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" Apr 16 08:35:15.531917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.531807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-config\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.532047 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.531936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lj5\" (UniqueName: \"kubernetes.io/projected/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-kube-api-access-l4lj5\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.532047 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.532010 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feaae171-21c4-4aed-973a-8bfcf22b6913-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.532147 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.532061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggtgt\" (UniqueName: \"kubernetes.io/projected/feaae171-21c4-4aed-973a-8bfcf22b6913-kube-api-access-ggtgt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.532147 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.532090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feaae171-21c4-4aed-973a-8bfcf22b6913-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.532571 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.532547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-config\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.532571 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.532570 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/feaae171-21c4-4aed-973a-8bfcf22b6913-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.535673 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.534961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/feaae171-21c4-4aed-973a-8bfcf22b6913-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.535673 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.535632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-serving-cert\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.541448 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.541413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggtgt\" (UniqueName: \"kubernetes.io/projected/feaae171-21c4-4aed-973a-8bfcf22b6913-kube-api-access-ggtgt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-psv25\" (UID: \"feaae171-21c4-4aed-973a-8bfcf22b6913\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.541537 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.541477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2846\" (UniqueName: 
\"kubernetes.io/projected/6e233b43-9590-4290-81b4-184a88df4ccf-kube-api-access-b2846\") pod \"volume-data-source-validator-7d955d5dd4-z8dmd\" (UID: \"6e233b43-9590-4290-81b4-184a88df4ccf\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" Apr 16 08:35:15.541657 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.541641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lj5\" (UniqueName: \"kubernetes.io/projected/3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb-kube-api-access-l4lj5\") pod \"service-ca-operator-69965bb79d-zngbr\" (UID: \"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.605165 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.603362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-f5z6f"] Apr 16 08:35:15.607692 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:15.607664 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd76203c8_22cc_48c0_a9be_e10030af2601.slice/crio-9056aa009cf92be2096b3e6c290a10ee36f2ee7e0fd7181854e4e5181034f11d WatchSource:0}: Error finding container 9056aa009cf92be2096b3e6c290a10ee36f2ee7e0fd7181854e4e5181034f11d: Status 404 returned error can't find the container with id 9056aa009cf92be2096b3e6c290a10ee36f2ee7e0fd7181854e4e5181034f11d Apr 16 08:35:15.682201 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.682148 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" Apr 16 08:35:15.688716 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.688694 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" Apr 16 08:35:15.694308 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.694290 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" Apr 16 08:35:15.809917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.809875 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr"] Apr 16 08:35:15.814091 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:15.814058 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d23fa67_49e4_407a_b6b8_4bd17f8a1bfb.slice/crio-5b146a3aa21005dca54f90532cedcb46b9b25772fe8334d859b6675a9600c554 WatchSource:0}: Error finding container 5b146a3aa21005dca54f90532cedcb46b9b25772fe8334d859b6675a9600c554: Status 404 returned error can't find the container with id 5b146a3aa21005dca54f90532cedcb46b9b25772fe8334d859b6675a9600c554 Apr 16 08:35:15.872780 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.872750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" event={"ID":"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb","Type":"ContainerStarted","Data":"5b146a3aa21005dca54f90532cedcb46b9b25772fe8334d859b6675a9600c554"} Apr 16 08:35:15.873744 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.873726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" event={"ID":"d76203c8-22cc-48c0-a9be-e10030af2601","Type":"ContainerStarted","Data":"9056aa009cf92be2096b3e6c290a10ee36f2ee7e0fd7181854e4e5181034f11d"} Apr 16 08:35:15.936288 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:15.936222 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"
Apr 16 08:35:15.936379 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:15.936356 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:15.936423 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:15.936413 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls podName:b6d16575-3414-445e-b597-457d144a72f3 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:16.936398248 +0000 UTC m=+125.984975355 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-6q9xt" (UID: "b6d16575-3414-445e-b597-457d144a72f3") : secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:16.029769 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:16.029740 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25"]
Apr 16 08:35:16.032788 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:16.032765 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd"]
Apr 16 08:35:16.033178 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:16.033155 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeaae171_21c4_4aed_973a_8bfcf22b6913.slice/crio-29d42e506955b0927d83712631a1400a4ad2e848332862b21a56081a45a6769c WatchSource:0}: Error finding container 29d42e506955b0927d83712631a1400a4ad2e848332862b21a56081a45a6769c: Status 404 returned error can't find the container with id 29d42e506955b0927d83712631a1400a4ad2e848332862b21a56081a45a6769c
Apr 16 08:35:16.037172 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:16.037148 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e233b43_9590_4290_81b4_184a88df4ccf.slice/crio-1f93c45387b379baabaa5d7e80f88f23407e9ab2234eab76cbe4da1fe8c0b290 WatchSource:0}: Error finding container 1f93c45387b379baabaa5d7e80f88f23407e9ab2234eab76cbe4da1fe8c0b290: Status 404 returned error can't find the container with id 1f93c45387b379baabaa5d7e80f88f23407e9ab2234eab76cbe4da1fe8c0b290
Apr 16 08:35:16.877812 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:16.877745 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" event={"ID":"6e233b43-9590-4290-81b4-184a88df4ccf","Type":"ContainerStarted","Data":"1f93c45387b379baabaa5d7e80f88f23407e9ab2234eab76cbe4da1fe8c0b290"}
Apr 16 08:35:16.879317 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:16.879255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" event={"ID":"feaae171-21c4-4aed-973a-8bfcf22b6913","Type":"ContainerStarted","Data":"29d42e506955b0927d83712631a1400a4ad2e848332862b21a56081a45a6769c"}
Apr 16 08:35:16.944626 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:16.944069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"
Apr 16 08:35:16.944626 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:16.944219 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:16.944626 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:16.944283 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls podName:b6d16575-3414-445e-b597-457d144a72f3 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:18.944264413 +0000 UTC m=+127.992841522 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-6q9xt" (UID: "b6d16575-3414-445e-b597-457d144a72f3") : secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:18.959365 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:18.959333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"
Apr 16 08:35:18.959734 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:18.959469 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 08:35:18.959734 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:18.959527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls podName:b6d16575-3414-445e-b597-457d144a72f3 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:22.959511172 +0000 UTC m=+132.008088287 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-6q9xt" (UID: "b6d16575-3414-445e-b597-457d144a72f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 08:35:19.645759 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.645726 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b888d4bcc-4hfq4"] Apr 16 08:35:19.649818 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.649799 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.652282 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.652260 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 08:35:19.652404 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.652289 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 08:35:19.652404 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.652331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dk699\"" Apr 16 08:35:19.652404 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.652268 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 08:35:19.658345 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.658157 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 08:35:19.661110 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.661082 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-5b888d4bcc-4hfq4"] Apr 16 08:35:19.764960 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.764930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-image-registry-private-configuration\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.764960 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.764968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-certificates\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.765212 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.764988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.765212 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.765085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-bound-sa-token\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.765212 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.765141 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-trusted-ca\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.765363 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.765212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-installation-pull-secrets\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.765363 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.765250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfa1c274-a26e-42db-87a1-64a66ad0269e-ca-trust-extracted\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.765363 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.765322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz7xq\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-kube-api-access-sz7xq\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.866661 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.866624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/cfa1c274-a26e-42db-87a1-64a66ad0269e-ca-trust-extracted\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.866826 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.866684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sz7xq\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-kube-api-access-sz7xq\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.866889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.866859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-image-registry-private-configuration\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.866948 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.866916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-certificates\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.867001 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.866950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " 
pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.867001 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.866994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-bound-sa-token\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.867128 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.867071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfa1c274-a26e-42db-87a1-64a66ad0269e-ca-trust-extracted\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.867128 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:19.867090 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:35:19.867128 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:19.867108 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b888d4bcc-4hfq4: secret "image-registry-tls" not found Apr 16 08:35:19.867263 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:19.867162 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls podName:cfa1c274-a26e-42db-87a1-64a66ad0269e nodeName:}" failed. No retries permitted until 2026-04-16 08:35:20.367142586 +0000 UTC m=+129.415719696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls") pod "image-registry-5b888d4bcc-4hfq4" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e") : secret "image-registry-tls" not found Apr 16 08:35:19.867263 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.867160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-trusted-ca\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.867263 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.867238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-installation-pull-secrets\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.868349 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.868324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-certificates\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.868984 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.868958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-trusted-ca\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 
08:35:19.870315 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.870278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-image-registry-private-configuration\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.870571 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.870553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-installation-pull-secrets\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.878928 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.878907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-bound-sa-token\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.879335 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.879307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz7xq\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-kube-api-access-sz7xq\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:19.887393 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.887360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" 
event={"ID":"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb","Type":"ContainerStarted","Data":"8b3565b25ed6c0913dfa1eeb7d432fd10bc5000e2c7280d26b96cdca7efffb35"} Apr 16 08:35:19.888811 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.888786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" event={"ID":"d76203c8-22cc-48c0-a9be-e10030af2601","Type":"ContainerStarted","Data":"4bb58c07935594ac13392753b52a49624b2caf838eb3d581e73ba2c3b1cf8728"} Apr 16 08:35:19.890069 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.890042 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" event={"ID":"6e233b43-9590-4290-81b4-184a88df4ccf","Type":"ContainerStarted","Data":"f9dd0badabe27649fc975fa5f7f4dc0433b0512c8209795d54db35716fb831a3"} Apr 16 08:35:19.891348 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.891318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" event={"ID":"feaae171-21c4-4aed-973a-8bfcf22b6913","Type":"ContainerStarted","Data":"30e30e045d63d1a7648a917e29458433b96965883980d4b644a4153f8c7ff0da"} Apr 16 08:35:19.904412 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.904329 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" podStartSLOduration=1.719457974 podStartE2EDuration="4.904314838s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="2026-04-16 08:35:15.815792121 +0000 UTC m=+124.864369228" lastFinishedPulling="2026-04-16 08:35:19.000648961 +0000 UTC m=+128.049226092" observedRunningTime="2026-04-16 08:35:19.903592245 +0000 UTC m=+128.952169387" watchObservedRunningTime="2026-04-16 08:35:19.904314838 +0000 UTC m=+128.952891968" Apr 16 08:35:19.920504 ip-10-0-128-115 
kubenswrapper[2574]: I0416 08:35:19.920465 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" podStartSLOduration=1.9576453539999998 podStartE2EDuration="4.920453651s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="2026-04-16 08:35:16.035057877 +0000 UTC m=+125.083634984" lastFinishedPulling="2026-04-16 08:35:18.997866159 +0000 UTC m=+128.046443281" observedRunningTime="2026-04-16 08:35:19.920046778 +0000 UTC m=+128.968623912" watchObservedRunningTime="2026-04-16 08:35:19.920453651 +0000 UTC m=+128.969030821" Apr 16 08:35:19.935464 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.935245 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-z8dmd" podStartSLOduration=1.978863999 podStartE2EDuration="4.935170952s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="2026-04-16 08:35:16.038705111 +0000 UTC m=+125.087282218" lastFinishedPulling="2026-04-16 08:35:18.995012065 +0000 UTC m=+128.043589171" observedRunningTime="2026-04-16 08:35:19.93458292 +0000 UTC m=+128.983160049" watchObservedRunningTime="2026-04-16 08:35:19.935170952 +0000 UTC m=+128.983748079" Apr 16 08:35:19.956538 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:19.956503 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" podStartSLOduration=1.571453626 podStartE2EDuration="4.956492997s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="2026-04-16 08:35:15.609340471 +0000 UTC m=+124.657917577" lastFinishedPulling="2026-04-16 08:35:18.994379839 +0000 UTC m=+128.042956948" observedRunningTime="2026-04-16 08:35:19.955706132 +0000 UTC m=+129.004283262" watchObservedRunningTime="2026-04-16 08:35:19.956492997 +0000 UTC 
m=+129.005070126" Apr 16 08:35:20.372063 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.372005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:20.372414 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:20.372155 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:35:20.372414 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:20.372173 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b888d4bcc-4hfq4: secret "image-registry-tls" not found Apr 16 08:35:20.372414 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:20.372232 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls podName:cfa1c274-a26e-42db-87a1-64a66ad0269e nodeName:}" failed. No retries permitted until 2026-04-16 08:35:21.372216048 +0000 UTC m=+130.420793156 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls") pod "image-registry-5b888d4bcc-4hfq4" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e") : secret "image-registry-tls" not found Apr 16 08:35:20.690562 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.690531 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48"] Apr 16 08:35:20.694303 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.694288 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" Apr 16 08:35:20.696363 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.696342 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dnq4l\"" Apr 16 08:35:20.701845 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.701825 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48"] Apr 16 08:35:20.775433 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.775406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmhr\" (UniqueName: \"kubernetes.io/projected/ae71d722-6e3e-4e76-b3a7-90b11657ce93-kube-api-access-kvmhr\") pod \"network-check-source-7b678d77c7-nzw48\" (UID: \"ae71d722-6e3e-4e76-b3a7-90b11657ce93\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" Apr 16 08:35:20.876369 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.876344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmhr\" (UniqueName: \"kubernetes.io/projected/ae71d722-6e3e-4e76-b3a7-90b11657ce93-kube-api-access-kvmhr\") pod \"network-check-source-7b678d77c7-nzw48\" (UID: \"ae71d722-6e3e-4e76-b3a7-90b11657ce93\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" Apr 16 08:35:20.892609 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:20.892584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmhr\" (UniqueName: \"kubernetes.io/projected/ae71d722-6e3e-4e76-b3a7-90b11657ce93-kube-api-access-kvmhr\") pod \"network-check-source-7b678d77c7-nzw48\" (UID: \"ae71d722-6e3e-4e76-b3a7-90b11657ce93\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" Apr 16 08:35:21.003836 ip-10-0-128-115 kubenswrapper[2574]: 
I0416 08:35:21.003779 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" Apr 16 08:35:21.116498 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:21.116472 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48"] Apr 16 08:35:21.119207 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:21.119180 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae71d722_6e3e_4e76_b3a7_90b11657ce93.slice/crio-02ce2574763fd2d86d9928199ca97d8ae7fa9cc785005d41b946aebb48981c12 WatchSource:0}: Error finding container 02ce2574763fd2d86d9928199ca97d8ae7fa9cc785005d41b946aebb48981c12: Status 404 returned error can't find the container with id 02ce2574763fd2d86d9928199ca97d8ae7fa9cc785005d41b946aebb48981c12 Apr 16 08:35:21.279863 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:21.279790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:35:21.279991 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:21.279928 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:35:21.279991 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:21.279987 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs podName:b1290b06-222c-45ae-985a-c88370488114 nodeName:}" failed. No retries permitted until 2026-04-16 08:37:23.279973314 +0000 UTC m=+252.328550420 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs") pod "network-metrics-daemon-r9fn4" (UID: "b1290b06-222c-45ae-985a-c88370488114") : secret "metrics-daemon-secret" not found Apr 16 08:35:21.380667 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:21.380639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:21.380990 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:21.380796 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:35:21.380990 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:21.380814 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b888d4bcc-4hfq4: secret "image-registry-tls" not found Apr 16 08:35:21.380990 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:21.380870 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls podName:cfa1c274-a26e-42db-87a1-64a66ad0269e nodeName:}" failed. No retries permitted until 2026-04-16 08:35:23.38085328 +0000 UTC m=+132.429430400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls") pod "image-registry-5b888d4bcc-4hfq4" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e") : secret "image-registry-tls" not found Apr 16 08:35:21.900329 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:21.900294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" event={"ID":"ae71d722-6e3e-4e76-b3a7-90b11657ce93","Type":"ContainerStarted","Data":"08a1d58c7b049883c2db4b6e7c49983611c65ac54991b7d70a0535a9b38269c1"} Apr 16 08:35:21.900329 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:21.900331 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" event={"ID":"ae71d722-6e3e-4e76-b3a7-90b11657ce93","Type":"ContainerStarted","Data":"02ce2574763fd2d86d9928199ca97d8ae7fa9cc785005d41b946aebb48981c12"} Apr 16 08:35:21.917684 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:21.917601 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-nzw48" podStartSLOduration=1.9175855739999998 podStartE2EDuration="1.917585574s" podCreationTimestamp="2026-04-16 08:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:35:21.917355967 +0000 UTC m=+130.965933097" watchObservedRunningTime="2026-04-16 08:35:21.917585574 +0000 UTC m=+130.966162702" Apr 16 08:35:22.058085 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:22.058057 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wqhp5_99c69840-b6dc-47aa-a435-9c9a49111d84/dns-node-resolver/0.log" Apr 16 08:35:22.993929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:22.993894 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:22.994329 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:22.994062 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 08:35:22.994329 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:22.994152 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls podName:b6d16575-3414-445e-b597-457d144a72f3 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:30.99413114 +0000 UTC m=+140.042708266 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-6q9xt" (UID: "b6d16575-3414-445e-b597-457d144a72f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 08:35:23.058322 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:23.058297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mvrgd_eacd4fff-e409-4534-945c-507d909b8258/node-ca/0.log" Apr 16 08:35:23.397612 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:23.397519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:23.397765 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:23.397645 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:35:23.397765 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:23.397661 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b888d4bcc-4hfq4: secret "image-registry-tls" not found Apr 16 08:35:23.397765 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:23.397730 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls podName:cfa1c274-a26e-42db-87a1-64a66ad0269e nodeName:}" failed. No retries permitted until 2026-04-16 08:35:27.397712199 +0000 UTC m=+136.446289306 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls") pod "image-registry-5b888d4bcc-4hfq4" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e") : secret "image-registry-tls" not found Apr 16 08:35:27.429955 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:27.429098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:27.429955 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:27.429597 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:35:27.429955 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:27.429638 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b888d4bcc-4hfq4: secret "image-registry-tls" not found Apr 16 08:35:27.429955 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:27.429708 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls podName:cfa1c274-a26e-42db-87a1-64a66ad0269e nodeName:}" failed. No retries permitted until 2026-04-16 08:35:35.429681339 +0000 UTC m=+144.478258463 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls") pod "image-registry-5b888d4bcc-4hfq4" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e") : secret "image-registry-tls" not found Apr 16 08:35:31.056205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:31.056167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:31.056570 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:31.056300 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 08:35:31.056570 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:31.056364 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls podName:b6d16575-3414-445e-b597-457d144a72f3 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:47.056346981 +0000 UTC m=+156.104924089 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-6q9xt" (UID: "b6d16575-3414-445e-b597-457d144a72f3") : secret "cluster-monitoring-operator-tls" not found Apr 16 08:35:35.488710 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.488674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:35.491066 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.491034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"image-registry-5b888d4bcc-4hfq4\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") " pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:35.561878 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.561850 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:35.680296 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.680270 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b888d4bcc-4hfq4"] Apr 16 08:35:35.683497 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:35.683466 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa1c274_a26e_42db_87a1_64a66ad0269e.slice/crio-a57533f684f7e29a17c93dc7843424d1c692452610f3939e1812840f84a07471 WatchSource:0}: Error finding container a57533f684f7e29a17c93dc7843424d1c692452610f3939e1812840f84a07471: Status 404 returned error can't find the container with id a57533f684f7e29a17c93dc7843424d1c692452610f3939e1812840f84a07471 Apr 16 08:35:35.936924 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.936881 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" event={"ID":"cfa1c274-a26e-42db-87a1-64a66ad0269e","Type":"ContainerStarted","Data":"38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb"} Apr 16 08:35:35.936924 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.936928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" event={"ID":"cfa1c274-a26e-42db-87a1-64a66ad0269e","Type":"ContainerStarted","Data":"a57533f684f7e29a17c93dc7843424d1c692452610f3939e1812840f84a07471"} Apr 16 08:35:35.937145 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.937011 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" Apr 16 08:35:35.956376 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:35.956331 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" 
podStartSLOduration=16.956317365 podStartE2EDuration="16.956317365s" podCreationTimestamp="2026-04-16 08:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:35:35.955581487 +0000 UTC m=+145.004158616" watchObservedRunningTime="2026-04-16 08:35:35.956317365 +0000 UTC m=+145.004894494" Apr 16 08:35:43.616657 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.616623 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-h85fk"] Apr 16 08:35:43.622917 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.622892 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.625378 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.625354 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 08:35:43.626428 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.626413 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 08:35:43.626502 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.626434 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rqvkt\"" Apr 16 08:35:43.632649 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.632632 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h85fk"] Apr 16 08:35:43.667202 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.667180 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-qhwnd"] Apr 16 08:35:43.670216 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.670203 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-qhwnd" Apr 16 08:35:43.673015 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.672995 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-pc2fc\"" Apr 16 08:35:43.673118 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.673000 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 08:35:43.673385 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.673354 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 08:35:43.675507 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.675485 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b888d4bcc-4hfq4"] Apr 16 08:35:43.685583 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.685559 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-qhwnd"] Apr 16 08:35:43.751325 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.751304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e8741a85-3d9a-4923-833d-ff0cdacf96dd-crio-socket\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.751454 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.751341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkkkg\" (UniqueName: \"kubernetes.io/projected/bf11a112-7c77-4855-924a-4cbe4f4b77eb-kube-api-access-kkkkg\") pod \"downloads-586b57c7b4-qhwnd\" (UID: \"bf11a112-7c77-4855-924a-4cbe4f4b77eb\") " pod="openshift-console/downloads-586b57c7b4-qhwnd" Apr 16 
08:35:43.751454 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.751362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e8741a85-3d9a-4923-833d-ff0cdacf96dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.751454 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.751385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e8741a85-3d9a-4923-833d-ff0cdacf96dd-data-volume\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.751454 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.751407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e8741a85-3d9a-4923-833d-ff0cdacf96dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.751588 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.751471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4dl\" (UniqueName: \"kubernetes.io/projected/e8741a85-3d9a-4923-833d-ff0cdacf96dd-kube-api-access-cm4dl\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852253 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cm4dl\" (UniqueName: \"kubernetes.io/projected/e8741a85-3d9a-4923-833d-ff0cdacf96dd-kube-api-access-cm4dl\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852366 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e8741a85-3d9a-4923-833d-ff0cdacf96dd-crio-socket\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852366 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkkkg\" (UniqueName: \"kubernetes.io/projected/bf11a112-7c77-4855-924a-4cbe4f4b77eb-kube-api-access-kkkkg\") pod \"downloads-586b57c7b4-qhwnd\" (UID: \"bf11a112-7c77-4855-924a-4cbe4f4b77eb\") " pod="openshift-console/downloads-586b57c7b4-qhwnd" Apr 16 08:35:43.852366 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e8741a85-3d9a-4923-833d-ff0cdacf96dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852366 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e8741a85-3d9a-4923-833d-ff0cdacf96dd-data-volume\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852550 
ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e8741a85-3d9a-4923-833d-ff0cdacf96dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852550 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e8741a85-3d9a-4923-833d-ff0cdacf96dd-crio-socket\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852771 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e8741a85-3d9a-4923-833d-ff0cdacf96dd-data-volume\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.852891 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.852875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e8741a85-3d9a-4923-833d-ff0cdacf96dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.854627 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.854607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e8741a85-3d9a-4923-833d-ff0cdacf96dd-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.864639 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.864615 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4dl\" (UniqueName: \"kubernetes.io/projected/e8741a85-3d9a-4923-833d-ff0cdacf96dd-kube-api-access-cm4dl\") pod \"insights-runtime-extractor-h85fk\" (UID: \"e8741a85-3d9a-4923-833d-ff0cdacf96dd\") " pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.864714 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.864675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkkkg\" (UniqueName: \"kubernetes.io/projected/bf11a112-7c77-4855-924a-4cbe4f4b77eb-kube-api-access-kkkkg\") pod \"downloads-586b57c7b4-qhwnd\" (UID: \"bf11a112-7c77-4855-924a-4cbe4f4b77eb\") " pod="openshift-console/downloads-586b57c7b4-qhwnd" Apr 16 08:35:43.931797 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.931778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h85fk" Apr 16 08:35:43.978182 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:43.978103 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-qhwnd" Apr 16 08:35:44.056057 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:44.056007 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h85fk"] Apr 16 08:35:44.107007 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:44.106966 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-qhwnd"] Apr 16 08:35:44.110228 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:44.110204 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf11a112_7c77_4855_924a_4cbe4f4b77eb.slice/crio-c73ff1d29ef096826d27a77cb24b56fbab9a684557d8c3b55d1a68f8303c0305 WatchSource:0}: Error finding container c73ff1d29ef096826d27a77cb24b56fbab9a684557d8c3b55d1a68f8303c0305: Status 404 returned error can't find the container with id c73ff1d29ef096826d27a77cb24b56fbab9a684557d8c3b55d1a68f8303c0305 Apr 16 08:35:44.961941 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:44.961906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-qhwnd" event={"ID":"bf11a112-7c77-4855-924a-4cbe4f4b77eb","Type":"ContainerStarted","Data":"c73ff1d29ef096826d27a77cb24b56fbab9a684557d8c3b55d1a68f8303c0305"} Apr 16 08:35:44.963565 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:44.963538 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h85fk" event={"ID":"e8741a85-3d9a-4923-833d-ff0cdacf96dd","Type":"ContainerStarted","Data":"b24d020876c27ae65ba005279cc4743c388cf72a3d7efd5705e08ba511a69749"} Apr 16 08:35:44.963565 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:44.963568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h85fk" 
event={"ID":"e8741a85-3d9a-4923-833d-ff0cdacf96dd","Type":"ContainerStarted","Data":"4d8c3f38559f12bbb8ec749c2ad4f06080a18785e3bbf191852e959e886f83af"} Apr 16 08:35:44.963724 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:44.963577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h85fk" event={"ID":"e8741a85-3d9a-4923-833d-ff0cdacf96dd","Type":"ContainerStarted","Data":"42a8e76dab2c8a48d8efc7f73e8cbc324ed878df6c28aab557c592f047139874"} Apr 16 08:35:46.972674 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:46.972639 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h85fk" event={"ID":"e8741a85-3d9a-4923-833d-ff0cdacf96dd","Type":"ContainerStarted","Data":"fbc329d7ab625c8c41671ea21e4052178abd9e26d2142cffe73ec7ca6fb2dc4b"} Apr 16 08:35:46.992806 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:46.992758 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-h85fk" podStartSLOduration=1.977697536 podStartE2EDuration="3.992743601s" podCreationTimestamp="2026-04-16 08:35:43 +0000 UTC" firstStartedPulling="2026-04-16 08:35:44.126962303 +0000 UTC m=+153.175539409" lastFinishedPulling="2026-04-16 08:35:46.142008362 +0000 UTC m=+155.190585474" observedRunningTime="2026-04-16 08:35:46.991363186 +0000 UTC m=+156.039940316" watchObservedRunningTime="2026-04-16 08:35:46.992743601 +0000 UTC m=+156.041320708" Apr 16 08:35:47.074744 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.074702 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 
08:35:47.077553 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.077524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6d16575-3414-445e-b597-457d144a72f3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-6q9xt\" (UID: \"b6d16575-3414-445e-b597-457d144a72f3\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:47.083388 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.083354 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" Apr 16 08:35:47.226330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.226041 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt"] Apr 16 08:35:47.228883 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:47.228850 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d16575_3414_445e_b597_457d144a72f3.slice/crio-d1a22039a2b0241f36e72803c0d3756c31f1fcd94b8b2973aa9be834a24c5f8d WatchSource:0}: Error finding container d1a22039a2b0241f36e72803c0d3756c31f1fcd94b8b2973aa9be834a24c5f8d: Status 404 returned error can't find the container with id d1a22039a2b0241f36e72803c0d3756c31f1fcd94b8b2973aa9be834a24c5f8d Apr 16 08:35:47.303643 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:47.303609 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zc2r7" podUID="5ee54e49-cfe0-4681-a0fd-a87ecc0d841c" Apr 16 08:35:47.310796 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:47.310768 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], 
failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rnmq6" podUID="6ac45079-5104-4b55-acd6-dd06367716a0" Apr 16 08:35:47.545743 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:35:47.545657 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-r9fn4" podUID="b1290b06-222c-45ae-985a-c88370488114" Apr 16 08:35:47.976354 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.976204 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zc2r7" Apr 16 08:35:47.976354 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.976217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:35:47.976354 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:47.976258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" event={"ID":"b6d16575-3414-445e-b597-457d144a72f3","Type":"ContainerStarted","Data":"d1a22039a2b0241f36e72803c0d3756c31f1fcd94b8b2973aa9be834a24c5f8d"} Apr 16 08:35:49.983972 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:49.983940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" event={"ID":"b6d16575-3414-445e-b597-457d144a72f3","Type":"ContainerStarted","Data":"ce1d5db2ef34224798cb6ec251d1eaaa6fa7b56a8a9488fb15e98acbfc560666"} Apr 16 08:35:50.000687 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:50.000637 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-6q9xt" podStartSLOduration=33.239757256 podStartE2EDuration="35.000621053s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" 
firstStartedPulling="2026-04-16 08:35:47.231053019 +0000 UTC m=+156.279630125" lastFinishedPulling="2026-04-16 08:35:48.991916812 +0000 UTC m=+158.040493922" observedRunningTime="2026-04-16 08:35:50.000101571 +0000 UTC m=+159.048678739" watchObservedRunningTime="2026-04-16 08:35:50.000621053 +0000 UTC m=+159.049198187" Apr 16 08:35:52.320403 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.320361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:35:52.320963 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.320419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:35:52.323422 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.323389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee54e49-cfe0-4681-a0fd-a87ecc0d841c-metrics-tls\") pod \"dns-default-zc2r7\" (UID: \"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c\") " pod="openshift-dns/dns-default-zc2r7" Apr 16 08:35:52.323548 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.323448 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac45079-5104-4b55-acd6-dd06367716a0-cert\") pod \"ingress-canary-rnmq6\" (UID: \"6ac45079-5104-4b55-acd6-dd06367716a0\") " pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:35:52.480005 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.479964 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-l2kgt\"" Apr 16 08:35:52.480697 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.480680 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kp7vt\"" Apr 16 08:35:52.488539 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.488517 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zc2r7" Apr 16 08:35:52.488660 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.488521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rnmq6" Apr 16 08:35:52.628555 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.628524 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rnmq6"] Apr 16 08:35:52.633650 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:52.633621 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac45079_5104_4b55_acd6_dd06367716a0.slice/crio-8ba5ea21f6db5b6ce9aa09d61c96bc0cd1b803dc84d5988235961cc2dcece374 WatchSource:0}: Error finding container 8ba5ea21f6db5b6ce9aa09d61c96bc0cd1b803dc84d5988235961cc2dcece374: Status 404 returned error can't find the container with id 8ba5ea21f6db5b6ce9aa09d61c96bc0cd1b803dc84d5988235961cc2dcece374 Apr 16 08:35:52.647504 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.647483 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zc2r7"] Apr 16 08:35:52.650822 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:35:52.650782 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee54e49_cfe0_4681_a0fd_a87ecc0d841c.slice/crio-87adb8869276cf316ab3f8fcaafc22b346eef39ecd5beda8b8fd71afc613964a WatchSource:0}: Error finding container 
87adb8869276cf316ab3f8fcaafc22b346eef39ecd5beda8b8fd71afc613964a: Status 404 returned error can't find the container with id 87adb8869276cf316ab3f8fcaafc22b346eef39ecd5beda8b8fd71afc613964a
Apr 16 08:35:52.994431 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.994389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zc2r7" event={"ID":"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c","Type":"ContainerStarted","Data":"87adb8869276cf316ab3f8fcaafc22b346eef39ecd5beda8b8fd71afc613964a"}
Apr 16 08:35:52.995727 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:52.995687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rnmq6" event={"ID":"6ac45079-5104-4b55-acd6-dd06367716a0","Type":"ContainerStarted","Data":"8ba5ea21f6db5b6ce9aa09d61c96bc0cd1b803dc84d5988235961cc2dcece374"}
Apr 16 08:35:53.614534 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.614492 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-mt5k4"]
Apr 16 08:35:53.617939 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.617916 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.620518 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.620498 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 08:35:53.621330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.621298 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-42m5c\""
Apr 16 08:35:53.621673 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.621523 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 08:35:53.621673 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.621578 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 08:35:53.632896 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.632865 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-mt5k4"]
Apr 16 08:35:53.682465 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.682440 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4"
Apr 16 08:35:53.731942 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.731914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvmhh\" (UniqueName: \"kubernetes.io/projected/979e002d-fc74-4d7d-ab0d-89e5d381e244-kube-api-access-jvmhh\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.732128 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.732048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/979e002d-fc74-4d7d-ab0d-89e5d381e244-metrics-client-ca\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.732196 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.732130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/979e002d-fc74-4d7d-ab0d-89e5d381e244-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.732196 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.732163 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/979e002d-fc74-4d7d-ab0d-89e5d381e244-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.833418 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.833383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvmhh\" (UniqueName: \"kubernetes.io/projected/979e002d-fc74-4d7d-ab0d-89e5d381e244-kube-api-access-jvmhh\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.833568 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.833464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/979e002d-fc74-4d7d-ab0d-89e5d381e244-metrics-client-ca\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.833568 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.833531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/979e002d-fc74-4d7d-ab0d-89e5d381e244-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.833568 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.833563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/979e002d-fc74-4d7d-ab0d-89e5d381e244-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.834319 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.834270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/979e002d-fc74-4d7d-ab0d-89e5d381e244-metrics-client-ca\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.836286 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.836262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/979e002d-fc74-4d7d-ab0d-89e5d381e244-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.836397 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.836377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/979e002d-fc74-4d7d-ab0d-89e5d381e244-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.845520 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.845497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvmhh\" (UniqueName: \"kubernetes.io/projected/979e002d-fc74-4d7d-ab0d-89e5d381e244-kube-api-access-jvmhh\") pod \"prometheus-operator-78f957474d-mt5k4\" (UID: \"979e002d-fc74-4d7d-ab0d-89e5d381e244\") " pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:53.930215 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:53.930187 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4"
Apr 16 08:35:56.090890 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.090858 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-654675bfc8-v5st9"]
Apr 16 08:35:56.093948 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.093918 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.096318 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.096294 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 08:35:56.097276 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.097255 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 08:35:56.097397 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.097297 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 08:35:56.097481 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.097457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 08:35:56.097620 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.097509 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 08:35:56.097620 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.097560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jcp87\""
Apr 16 08:35:56.101498 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.101480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 08:35:56.104724 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.104700 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-654675bfc8-v5st9"]
Apr 16 08:35:56.150975 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.150904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-trusted-ca-bundle\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.150975 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.150935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-service-ca\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.150975 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.150963 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-console-config\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.151204 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.151005 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-oauth-config\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.151204 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.151119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-serving-cert\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.151204 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.151179 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-oauth-serving-cert\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.151308 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.151244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk7l8\" (UniqueName: \"kubernetes.io/projected/729825ae-d577-487b-9eca-273b4b9c19db-kube-api-access-qk7l8\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251632 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk7l8\" (UniqueName: \"kubernetes.io/projected/729825ae-d577-487b-9eca-273b4b9c19db-kube-api-access-qk7l8\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251800 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-trusted-ca-bundle\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251800 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-service-ca\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251913 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-console-config\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251913 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251826 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-oauth-config\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251913 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-serving-cert\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.251913 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.251900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-oauth-serving-cert\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.252651 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.252604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-oauth-serving-cert\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.252757 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.252648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-trusted-ca-bundle\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.252757 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.252688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-console-config\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.253243 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.253221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-service-ca\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.254504 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.254483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-oauth-config\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.255297 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.255271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-serving-cert\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.265287 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.265260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk7l8\" (UniqueName: \"kubernetes.io/projected/729825ae-d577-487b-9eca-273b4b9c19db-kube-api-access-qk7l8\") pod \"console-654675bfc8-v5st9\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:56.406180 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:56.406089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:35:59.534774 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:35:59.534741 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4"
Apr 16 08:36:01.008827 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.008782 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-mt5k4"]
Apr 16 08:36:01.015176 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:36:01.015141 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod979e002d_fc74_4d7d_ab0d_89e5d381e244.slice/crio-12c8cb6a6d3021fab7ef9a413fb05de8d42c4fbf3ceeff42b6a0737bfe67954c WatchSource:0}: Error finding container 12c8cb6a6d3021fab7ef9a413fb05de8d42c4fbf3ceeff42b6a0737bfe67954c: Status 404 returned error can't find the container with id 12c8cb6a6d3021fab7ef9a413fb05de8d42c4fbf3ceeff42b6a0737bfe67954c
Apr 16 08:36:01.022344 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.022312 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-qhwnd" event={"ID":"bf11a112-7c77-4855-924a-4cbe4f4b77eb","Type":"ContainerStarted","Data":"4eae3add9d6aedafb1c26925e9799a65382afdbb079711cc059c8965e87616db"}
Apr 16 08:36:01.023358 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.023336 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-qhwnd"
Apr 16 08:36:01.024326 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.024299 2574 patch_prober.go:28] interesting pod/downloads-586b57c7b4-qhwnd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.15:8080/\": dial tcp 10.134.0.15:8080: connect: connection refused" start-of-body=
Apr 16 08:36:01.024411 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.024352 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-qhwnd" podUID="bf11a112-7c77-4855-924a-4cbe4f4b77eb" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.15:8080/\": dial tcp 10.134.0.15:8080: connect: connection refused"
Apr 16 08:36:01.025297 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.025261 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rnmq6" event={"ID":"6ac45079-5104-4b55-acd6-dd06367716a0","Type":"ContainerStarted","Data":"d8c76e4e3b8fabc42d68cf5e37f4b922936d3be6351a6b95325577e3987c55b0"}
Apr 16 08:36:01.026590 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.026563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4" event={"ID":"979e002d-fc74-4d7d-ab0d-89e5d381e244","Type":"ContainerStarted","Data":"12c8cb6a6d3021fab7ef9a413fb05de8d42c4fbf3ceeff42b6a0737bfe67954c"}
Apr 16 08:36:01.038800 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.038777 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-654675bfc8-v5st9"]
Apr 16 08:36:01.041993 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.041633 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-qhwnd" podStartSLOduration=1.28059389 podStartE2EDuration="18.041617077s" podCreationTimestamp="2026-04-16 08:35:43 +0000 UTC" firstStartedPulling="2026-04-16 08:35:44.112138823 +0000 UTC m=+153.160715931" lastFinishedPulling="2026-04-16 08:36:00.873162003 +0000 UTC m=+169.921739118" observedRunningTime="2026-04-16 08:36:01.040388906 +0000 UTC m=+170.088966035" watchObservedRunningTime="2026-04-16 08:36:01.041617077 +0000 UTC m=+170.090194207"
Apr 16 08:36:01.044244 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:36:01.044215 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod729825ae_d577_487b_9eca_273b4b9c19db.slice/crio-3c41ed9d3bbc497aca95021bbe0e8d3dcfafc58782e2db500b4a4e3327a0e6b8 WatchSource:0}: Error finding container 3c41ed9d3bbc497aca95021bbe0e8d3dcfafc58782e2db500b4a4e3327a0e6b8: Status 404 returned error can't find the container with id 3c41ed9d3bbc497aca95021bbe0e8d3dcfafc58782e2db500b4a4e3327a0e6b8
Apr 16 08:36:01.057772 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:01.057713 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rnmq6" podStartSLOduration=128.833611035 podStartE2EDuration="2m17.057697899s" podCreationTimestamp="2026-04-16 08:33:44 +0000 UTC" firstStartedPulling="2026-04-16 08:35:52.636051885 +0000 UTC m=+161.684628992" lastFinishedPulling="2026-04-16 08:36:00.860138732 +0000 UTC m=+169.908715856" observedRunningTime="2026-04-16 08:36:01.056856128 +0000 UTC m=+170.105433258" watchObservedRunningTime="2026-04-16 08:36:01.057697899 +0000 UTC m=+170.106275026"
Apr 16 08:36:02.034274 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:02.034192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zc2r7" event={"ID":"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c","Type":"ContainerStarted","Data":"cd90076910794725fca54643e42bd0b8532f4117f4085523521cc6c2bd371e72"}
Apr 16 08:36:02.034274 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:02.034244 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zc2r7" event={"ID":"5ee54e49-cfe0-4681-a0fd-a87ecc0d841c","Type":"ContainerStarted","Data":"4f7cafa4f4ae97f5cecfe4db5d73b0096ee3f09d4b6d9043a18d4ad8f464e511"}
Apr 16 08:36:02.035453 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:02.034520 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:36:02.036839 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:02.036787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654675bfc8-v5st9" event={"ID":"729825ae-d577-487b-9eca-273b4b9c19db","Type":"ContainerStarted","Data":"3c41ed9d3bbc497aca95021bbe0e8d3dcfafc58782e2db500b4a4e3327a0e6b8"}
Apr 16 08:36:02.049851 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:02.049814 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-qhwnd"
Apr 16 08:36:02.094897 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:02.094774 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zc2r7" podStartSLOduration=129.891274283 podStartE2EDuration="2m18.09475522s" podCreationTimestamp="2026-04-16 08:33:44 +0000 UTC" firstStartedPulling="2026-04-16 08:35:52.652691514 +0000 UTC m=+161.701268621" lastFinishedPulling="2026-04-16 08:36:00.856172441 +0000 UTC m=+169.904749558" observedRunningTime="2026-04-16 08:36:02.057655819 +0000 UTC m=+171.106232957" watchObservedRunningTime="2026-04-16 08:36:02.09475522 +0000 UTC m=+171.143332354"
Apr 16 08:36:03.041899 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:03.041802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4" event={"ID":"979e002d-fc74-4d7d-ab0d-89e5d381e244","Type":"ContainerStarted","Data":"3adc8452acdc6b13404f6c79240d78d4f4cd076630f2ca3be23991323a76f5e2"}
Apr 16 08:36:03.041899 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:03.041848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4" event={"ID":"979e002d-fc74-4d7d-ab0d-89e5d381e244","Type":"ContainerStarted","Data":"26f95ecf183356ee7d907dcbee8e2329298a7328a9f141eed239c23f6aa36ce3"}
Apr 16 08:36:03.065846 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:03.065755 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-mt5k4" podStartSLOduration=8.582748005 podStartE2EDuration="10.06573611s" podCreationTimestamp="2026-04-16 08:35:53 +0000 UTC" firstStartedPulling="2026-04-16 08:36:01.017563484 +0000 UTC m=+170.066140608" lastFinishedPulling="2026-04-16 08:36:02.500551601 +0000 UTC m=+171.549128713" observedRunningTime="2026-04-16 08:36:03.063467069 +0000 UTC m=+172.112044211" watchObservedRunningTime="2026-04-16 08:36:03.06573611 +0000 UTC m=+172.114313241"
Apr 16 08:36:04.966409 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:04.966241 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s8nnn"]
Apr 16 08:36:04.984816 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:04.984787 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:04.987055 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:04.987001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 08:36:04.987189 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:04.987055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 08:36:04.987189 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:04.987060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 08:36:04.987189 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:04.987122 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ggsvg\""
Apr 16 08:36:05.029819 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.029785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-tls\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.029964 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.029870 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2ww\" (UniqueName: \"kubernetes.io/projected/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-kube-api-access-nr2ww\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.029964 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.029894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-root\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.029964 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.029910 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-metrics-client-ca\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.029964 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.029936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.030167 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.029990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-textfile\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.030167 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.030056 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-wtmp\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.030167 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.030087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-accelerators-collector-config\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.030167 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.030159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-sys\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.052462 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.052381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654675bfc8-v5st9" event={"ID":"729825ae-d577-487b-9eca-273b4b9c19db","Type":"ContainerStarted","Data":"6906a18cadec2745ad0b4510d6765b380e98c50ec1383cc317c03a2ffcc03357"}
Apr 16 08:36:05.071795 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.071751 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-654675bfc8-v5st9" podStartSLOduration=5.330937401 podStartE2EDuration="9.071737054s" podCreationTimestamp="2026-04-16 08:35:56 +0000 UTC" firstStartedPulling="2026-04-16 08:36:01.046788157 +0000 UTC m=+170.095365271" lastFinishedPulling="2026-04-16 08:36:04.787587809 +0000 UTC m=+173.836164924" observedRunningTime="2026-04-16 08:36:05.070366142 +0000 UTC m=+174.118943272" watchObservedRunningTime="2026-04-16 08:36:05.071737054 +0000 UTC m=+174.120314184"
Apr 16 08:36:05.131189 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-root\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-metrics-client-ca\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-root\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-textfile\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131540 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-wtmp\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131540 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-accelerators-collector-config\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131636 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-textfile\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131636 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-wtmp\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131721 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-sys\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131721 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131711 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-sys\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131812 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-tls\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131909 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2ww\" (UniqueName: \"kubernetes.io/projected/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-kube-api-access-nr2ww\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.131909 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.131899 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-metrics-client-ca\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn"
Apr 16 08:36:05.132055 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:36:05.132009 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 08:36:05.132113 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:36:05.132095 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-tls
podName:8dbc5caf-71de-4f52-b43e-7b8c66ccbd78 nodeName:}" failed. No retries permitted until 2026-04-16 08:36:05.632075962 +0000 UTC m=+174.680653070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-tls") pod "node-exporter-s8nnn" (UID: "8dbc5caf-71de-4f52-b43e-7b8c66ccbd78") : secret "node-exporter-tls" not found Apr 16 08:36:05.134450 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.134429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn" Apr 16 08:36:05.141437 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.141415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-accelerators-collector-config\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn" Apr 16 08:36:05.142325 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.142299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2ww\" (UniqueName: \"kubernetes.io/projected/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-kube-api-access-nr2ww\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn" Apr 16 08:36:05.636368 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.636334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-tls\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn" Apr 16 08:36:05.638920 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.638891 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8dbc5caf-71de-4f52-b43e-7b8c66ccbd78-node-exporter-tls\") pod \"node-exporter-s8nnn\" (UID: \"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78\") " pod="openshift-monitoring/node-exporter-s8nnn" Apr 16 08:36:05.896573 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:05.896495 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s8nnn" Apr 16 08:36:05.906907 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:36:05.906877 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dbc5caf_71de_4f52_b43e_7b8c66ccbd78.slice/crio-b0acd35ca56099c14dbf8f63f624f081ee39f7cdf9d4c95d4fdc00e13b04319b WatchSource:0}: Error finding container b0acd35ca56099c14dbf8f63f624f081ee39f7cdf9d4c95d4fdc00e13b04319b: Status 404 returned error can't find the container with id b0acd35ca56099c14dbf8f63f624f081ee39f7cdf9d4c95d4fdc00e13b04319b Apr 16 08:36:06.057541 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.057505 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s8nnn" event={"ID":"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78","Type":"ContainerStarted","Data":"b0acd35ca56099c14dbf8f63f624f081ee39f7cdf9d4c95d4fdc00e13b04319b"} Apr 16 08:36:06.109555 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.109523 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 08:36:06.136444 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.136414 2574 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 08:36:06.136622 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.136600 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.138774 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.138750 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 08:36:06.138964 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.138899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 08:36:06.139141 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139059 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 08:36:06.139240 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139195 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 08:36:06.139394 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139348 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 08:36:06.139394 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139387 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 08:36:06.139526 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 08:36:06.139526 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 08:36:06.139526 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139366 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 08:36:06.139846 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.139829 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-46ct4\"" Apr 16 08:36:06.242123 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59674a8e-e0d2-4a74-9290-22c5a36c48b1-config-out\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242272 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59674a8e-e0d2-4a74-9290-22c5a36c48b1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242272 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242272 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242239 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-web-config\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242455 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59674a8e-e0d2-4a74-9290-22c5a36c48b1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242455 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242455 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242455 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-config-volume\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
08:36:06.242455 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khd5p\" (UniqueName: \"kubernetes.io/projected/59674a8e-e0d2-4a74-9290-22c5a36c48b1-kube-api-access-khd5p\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242686 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242686 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59674a8e-e0d2-4a74-9290-22c5a36c48b1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242686 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242531 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.242686 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.242559 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/59674a8e-e0d2-4a74-9290-22c5a36c48b1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.345709 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.345674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59674a8e-e0d2-4a74-9290-22c5a36c48b1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.345863 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.345729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59674a8e-e0d2-4a74-9290-22c5a36c48b1-config-out\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.345924 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.345862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59674a8e-e0d2-4a74-9290-22c5a36c48b1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346007 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.345988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346119 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-web-config\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346119 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59674a8e-e0d2-4a74-9290-22c5a36c48b1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346221 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346221 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346317 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-config-volume\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346317 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346269 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khd5p\" (UniqueName: \"kubernetes.io/projected/59674a8e-e0d2-4a74-9290-22c5a36c48b1-kube-api-access-khd5p\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346317 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59674a8e-e0d2-4a74-9290-22c5a36c48b1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346317 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346510 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59674a8e-e0d2-4a74-9290-22c5a36c48b1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346510 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:36:06.346375 2574 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 08:36:06.346510 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.346384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.346510 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:36:06.346458 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-main-tls podName:59674a8e-e0d2-4a74-9290-22c5a36c48b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:36:06.846434395 +0000 UTC m=+175.895011514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "59674a8e-e0d2-4a74-9290-22c5a36c48b1") : secret "alertmanager-main-tls" not found Apr 16 08:36:06.349305 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.348999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59674a8e-e0d2-4a74-9290-22c5a36c48b1-config-out\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.349913 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.349607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59674a8e-e0d2-4a74-9290-22c5a36c48b1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.351827 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.351605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.352319 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.352275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.352640 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.352614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-web-config\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.353558 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.353508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.353850 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.353813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-config-volume\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.354013 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.353931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59674a8e-e0d2-4a74-9290-22c5a36c48b1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.355604 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.355582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.357239 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.357198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khd5p\" (UniqueName: \"kubernetes.io/projected/59674a8e-e0d2-4a74-9290-22c5a36c48b1-kube-api-access-khd5p\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.360818 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.360772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59674a8e-e0d2-4a74-9290-22c5a36c48b1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.407107 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.406887 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-654675bfc8-v5st9" Apr 16 08:36:06.407107 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.406931 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-654675bfc8-v5st9" Apr 16 08:36:06.417281 ip-10-0-128-115 kubenswrapper[2574]: I0416 
08:36:06.417014 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-654675bfc8-v5st9" Apr 16 08:36:06.851269 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.851233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:06.854068 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:06.854042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59674a8e-e0d2-4a74-9290-22c5a36c48b1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59674a8e-e0d2-4a74-9290-22c5a36c48b1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 08:36:07.049983 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:07.049953 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 08:36:07.066161 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:07.066140 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-654675bfc8-v5st9"
Apr 16 08:36:07.212300 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:07.212250 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 08:36:07.277314 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:36:07.277279 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59674a8e_e0d2_4a74_9290_22c5a36c48b1.slice/crio-b55eab43114a2a145f377d959736b26ddfa42cc9583c73f822c58940dcd2bfcb WatchSource:0}: Error finding container b55eab43114a2a145f377d959736b26ddfa42cc9583c73f822c58940dcd2bfcb: Status 404 returned error can't find the container with id b55eab43114a2a145f377d959736b26ddfa42cc9583c73f822c58940dcd2bfcb
Apr 16 08:36:08.065933 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:08.065896 2574 generic.go:358] "Generic (PLEG): container finished" podID="8dbc5caf-71de-4f52-b43e-7b8c66ccbd78" containerID="4fabd0c4bee66a6c3de9f3d3b56491596bfdd3b5596f8934c483e572ed7977fe" exitCode=0
Apr 16 08:36:08.066142 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:08.066001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s8nnn" event={"ID":"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78","Type":"ContainerDied","Data":"4fabd0c4bee66a6c3de9f3d3b56491596bfdd3b5596f8934c483e572ed7977fe"}
Apr 16 08:36:08.067590 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:08.067563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"b55eab43114a2a145f377d959736b26ddfa42cc9583c73f822c58940dcd2bfcb"}
Apr 16 08:36:08.695683 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:08.695642 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" podUID="cfa1c274-a26e-42db-87a1-64a66ad0269e" containerName="registry" containerID="cri-o://38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb" gracePeriod=30
Apr 16 08:36:08.974691 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:08.974668 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4"
Apr 16 08:36:09.072406 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.072370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s8nnn" event={"ID":"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78","Type":"ContainerStarted","Data":"973db4fe8fedcd8ddd9f03ec5bdf5a02ba2754e741f808e5157c2b8a33569a8c"}
Apr 16 08:36:09.072844 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.072413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s8nnn" event={"ID":"8dbc5caf-71de-4f52-b43e-7b8c66ccbd78","Type":"ContainerStarted","Data":"69b03d3a15a28d3916e1e5d1874f90a0682dc5ec18ff72775ebe5f568a127d1b"}
Apr 16 08:36:09.072844 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.072834 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-trusted-ca\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.072956 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.072887 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.072956 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.072912 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-bound-sa-token\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.073083 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073053 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz7xq\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-kube-api-access-sz7xq\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.073144 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073097 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfa1c274-a26e-42db-87a1-64a66ad0269e-ca-trust-extracted\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.073144 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073138 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-image-registry-private-configuration\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.073249 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073172 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-certificates\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.073249 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073241 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-installation-pull-secrets\") pod \"cfa1c274-a26e-42db-87a1-64a66ad0269e\" (UID: \"cfa1c274-a26e-42db-87a1-64a66ad0269e\") "
Apr 16 08:36:09.073365 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073292 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:09.073473 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073451 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-trusted-ca\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.073803 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.073771 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:09.075315 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.074381 2574 generic.go:358] "Generic (PLEG): container finished" podID="59674a8e-e0d2-4a74-9290-22c5a36c48b1" containerID="c698eb34d99c11e075537a6bea09d1c6f519d559e91c3eeef4a9b81cc372b8c8" exitCode=0
Apr 16 08:36:09.075315 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.075239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerDied","Data":"c698eb34d99c11e075537a6bea09d1c6f519d559e91c3eeef4a9b81cc372b8c8"}
Apr 16 08:36:09.076345 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.076322 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:36:09.076345 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.076332 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:36:09.076345 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.076340 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:09.076532 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.076326 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:09.076532 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.076376 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-kube-api-access-sz7xq" (OuterVolumeSpecName: "kube-api-access-sz7xq") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "kube-api-access-sz7xq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:09.077560 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.077533 2574 generic.go:358] "Generic (PLEG): container finished" podID="cfa1c274-a26e-42db-87a1-64a66ad0269e" containerID="38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb" exitCode=0
Apr 16 08:36:09.077673 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.077607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" event={"ID":"cfa1c274-a26e-42db-87a1-64a66ad0269e","Type":"ContainerDied","Data":"38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb"}
Apr 16 08:36:09.077734 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.077704 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4"
Apr 16 08:36:09.077789 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.077736 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b888d4bcc-4hfq4" event={"ID":"cfa1c274-a26e-42db-87a1-64a66ad0269e","Type":"ContainerDied","Data":"a57533f684f7e29a17c93dc7843424d1c692452610f3939e1812840f84a07471"}
Apr 16 08:36:09.077789 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.077765 2574 scope.go:117] "RemoveContainer" containerID="38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb"
Apr 16 08:36:09.086192 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.086160 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa1c274-a26e-42db-87a1-64a66ad0269e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cfa1c274-a26e-42db-87a1-64a66ad0269e" (UID: "cfa1c274-a26e-42db-87a1-64a66ad0269e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 08:36:09.087385 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.087366 2574 scope.go:117] "RemoveContainer" containerID="38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb"
Apr 16 08:36:09.087680 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:36:09.087651 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb\": container with ID starting with 38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb not found: ID does not exist" containerID="38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb"
Apr 16 08:36:09.087761 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.087689 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb"} err="failed to get container status \"38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb\": rpc error: code = NotFound desc = could not find container \"38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb\": container with ID starting with 38ae0eb914bc0e2896ae7d4490c48f928044af25b31fb92df21b936efc78e8eb not found: ID does not exist"
Apr 16 08:36:09.093462 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.093420 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s8nnn" podStartSLOduration=3.702195109 podStartE2EDuration="5.093408611s" podCreationTimestamp="2026-04-16 08:36:04 +0000 UTC" firstStartedPulling="2026-04-16 08:36:05.909086663 +0000 UTC m=+174.957663784" lastFinishedPulling="2026-04-16 08:36:07.300300171 +0000 UTC m=+176.348877286" observedRunningTime="2026-04-16 08:36:09.091923618 +0000 UTC m=+178.140500746" watchObservedRunningTime="2026-04-16 08:36:09.093408611 +0000 UTC m=+178.141985950"
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174534 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sz7xq\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-kube-api-access-sz7xq\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174563 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfa1c274-a26e-42db-87a1-64a66ad0269e-ca-trust-extracted\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174580 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-image-registry-private-configuration\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174595 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-certificates\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174610 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfa1c274-a26e-42db-87a1-64a66ad0269e-installation-pull-secrets\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174625 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-registry-tls\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.174895 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.174642 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfa1c274-a26e-42db-87a1-64a66ad0269e-bound-sa-token\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:36:09.385703 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.385628 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-848cf94df8-z9594"]
Apr 16 08:36:09.386011 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.385987 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfa1c274-a26e-42db-87a1-64a66ad0269e" containerName="registry"
Apr 16 08:36:09.386011 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.386008 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa1c274-a26e-42db-87a1-64a66ad0269e" containerName="registry"
Apr 16 08:36:09.386193 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.386086 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfa1c274-a26e-42db-87a1-64a66ad0269e" containerName="registry"
Apr 16 08:36:09.416603 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.416577 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-848cf94df8-z9594"]
Apr 16 08:36:09.416726 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.416609 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b888d4bcc-4hfq4"]
Apr 16 08:36:09.416726 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.416624 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5b888d4bcc-4hfq4"]
Apr 16 08:36:09.416814 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.416735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.419403 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.419380 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 08:36:09.419512 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.419435 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 08:36:09.419512 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.419489 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 08:36:09.419594 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.419440 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-13dllu2fmge21\""
Apr 16 08:36:09.419646 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.419380 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7v675\""
Apr 16 08:36:09.419750 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.419733 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 08:36:09.477801 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.477776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/506fe655-8846-49dc-9a6e-28c4bd234649-metrics-server-audit-profiles\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.477920 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.477825 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-client-ca-bundle\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.477920 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.477856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-secret-metrics-server-tls\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.477920 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.477881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/506fe655-8846-49dc-9a6e-28c4bd234649-audit-log\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.478133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.477931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/506fe655-8846-49dc-9a6e-28c4bd234649-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.478133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.477997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-secret-metrics-server-client-certs\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.478133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.478043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjfh5\" (UniqueName: \"kubernetes.io/projected/506fe655-8846-49dc-9a6e-28c4bd234649-kube-api-access-vjfh5\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.546724 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.546593 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa1c274-a26e-42db-87a1-64a66ad0269e" path="/var/lib/kubelet/pods/cfa1c274-a26e-42db-87a1-64a66ad0269e/volumes"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-secret-metrics-server-tls\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/506fe655-8846-49dc-9a6e-28c4bd234649-audit-log\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/506fe655-8846-49dc-9a6e-28c4bd234649-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-secret-metrics-server-client-certs\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjfh5\" (UniqueName: \"kubernetes.io/projected/506fe655-8846-49dc-9a6e-28c4bd234649-kube-api-access-vjfh5\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/506fe655-8846-49dc-9a6e-28c4bd234649-metrics-server-audit-profiles\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579158 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.578740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-client-ca-bundle\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579850 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.579823 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/506fe655-8846-49dc-9a6e-28c4bd234649-audit-log\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.579937 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.579884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/506fe655-8846-49dc-9a6e-28c4bd234649-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.580990 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.580936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/506fe655-8846-49dc-9a6e-28c4bd234649-metrics-server-audit-profiles\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.582195 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.582171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-client-ca-bundle\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.583517 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.583471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-secret-metrics-server-client-certs\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.583751 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.583733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/506fe655-8846-49dc-9a6e-28c4bd234649-secret-metrics-server-tls\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.588976 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.588952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjfh5\" (UniqueName: \"kubernetes.io/projected/506fe655-8846-49dc-9a6e-28c4bd234649-kube-api-access-vjfh5\") pod \"metrics-server-848cf94df8-z9594\" (UID: \"506fe655-8846-49dc-9a6e-28c4bd234649\") " pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.737963 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.737932 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:09.769076 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.768733 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"]
Apr 16 08:36:09.792385 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.792358 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"]
Apr 16 08:36:09.792533 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.792499 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:09.795972 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.795947 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 08:36:09.796190 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.795955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4sz5w\""
Apr 16 08:36:09.881098 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.881063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2eed6f1e-3c07-4157-a57f-7a59091a0743-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9zc2x\" (UID: \"2eed6f1e-3c07-4157-a57f-7a59091a0743\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:09.891466 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.891379 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-848cf94df8-z9594"]
Apr 16 08:36:09.894713 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:36:09.894685 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506fe655_8846_49dc_9a6e_28c4bd234649.slice/crio-02c7638c39c4f19070ab6d50342c2fa904f601fb63d845af1405d3f7995c7b94 WatchSource:0}: Error finding container 02c7638c39c4f19070ab6d50342c2fa904f601fb63d845af1405d3f7995c7b94: Status 404 returned error can't find the container with id 02c7638c39c4f19070ab6d50342c2fa904f601fb63d845af1405d3f7995c7b94
Apr 16 08:36:09.982728 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.982380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2eed6f1e-3c07-4157-a57f-7a59091a0743-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9zc2x\" (UID: \"2eed6f1e-3c07-4157-a57f-7a59091a0743\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:09.985505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:09.985479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2eed6f1e-3c07-4157-a57f-7a59091a0743-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9zc2x\" (UID: \"2eed6f1e-3c07-4157-a57f-7a59091a0743\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:10.082493 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:10.082418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-848cf94df8-z9594" event={"ID":"506fe655-8846-49dc-9a6e-28c4bd234649","Type":"ContainerStarted","Data":"02c7638c39c4f19070ab6d50342c2fa904f601fb63d845af1405d3f7995c7b94"}
Apr 16 08:36:10.109807 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:10.109453 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:10.671245 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:10.671222 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"]
Apr 16 08:36:10.673861 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:36:10.673825 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eed6f1e_3c07_4157_a57f_7a59091a0743.slice/crio-dc7ea40c82c3d85949d13f8cd24eec424aa2d162bfc8badd2b4372502a5ea6ce WatchSource:0}: Error finding container dc7ea40c82c3d85949d13f8cd24eec424aa2d162bfc8badd2b4372502a5ea6ce: Status 404 returned error can't find the container with id dc7ea40c82c3d85949d13f8cd24eec424aa2d162bfc8badd2b4372502a5ea6ce
Apr 16 08:36:11.092061 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:11.091988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"209b5da1b929470fe9fd205dff1c2bae2f06c3b2b71cd72854e5ba240804db8b"}
Apr 16 08:36:11.096519 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:11.096462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x" event={"ID":"2eed6f1e-3c07-4157-a57f-7a59091a0743","Type":"ContainerStarted","Data":"dc7ea40c82c3d85949d13f8cd24eec424aa2d162bfc8badd2b4372502a5ea6ce"}
Apr 16 08:36:12.044769 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:12.044735 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zc2r7"
Apr 16 08:36:12.103338 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:12.103305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"6bef693eb5e71c826e1cb44c862a2556489ff469c0aa2ea83d6a617da9cf352e"}
Apr 16 08:36:12.103338 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:12.103342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"382eafb206b19cce62a9f2bcf76f07ec5f80dbde04887bd78b6613edca15a36d"}
Apr 16 08:36:12.103837 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:12.103351 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"31f747c380ed8cf947558d1c242b01fe377cdeb105547d3b1e4562790d2617ac"}
Apr 16 08:36:12.103837 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:12.103360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"e6b5815229efbe1fb1a82a31fd0fcb8c2ad37379ce508f4dceb245c7baca0f70"}
Apr 16 08:36:13.108723 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:13.108673 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-848cf94df8-z9594" event={"ID":"506fe655-8846-49dc-9a6e-28c4bd234649","Type":"ContainerStarted","Data":"b5cfd3935734ae7f1c0f23441f4ef71cf298edd715f0d2c831365cccb0b187cb"}
Apr 16 08:36:14.114689 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.114658 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59674a8e-e0d2-4a74-9290-22c5a36c48b1","Type":"ContainerStarted","Data":"a55b6ad6bbe8b5da62591a8a95c7bc9febee5e295aa2df159e0cf1f61dc41062"}
Apr 16 08:36:14.116000 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.115972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x" event={"ID":"2eed6f1e-3c07-4157-a57f-7a59091a0743","Type":"ContainerStarted","Data":"1c4731739b70fe2bcf3c949d8f6f7062d2347b647b9cf674fee6f04b71357510"}
Apr 16 08:36:14.116154 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.116135 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:14.120574 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.120556 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x"
Apr 16 08:36:14.144326 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.144252 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.816176252 podStartE2EDuration="8.144236695s" podCreationTimestamp="2026-04-16 08:36:06 +0000 UTC" firstStartedPulling="2026-04-16 08:36:07.296795841 +0000 UTC m=+176.345372955" lastFinishedPulling="2026-04-16 08:36:13.624856286 +0000 UTC m=+182.673433398" observedRunningTime="2026-04-16 08:36:14.142871637 +0000 UTC m=+183.191448796" watchObservedRunningTime="2026-04-16 08:36:14.144236695 +0000 UTC m=+183.192813824"
Apr 16 08:36:14.164896 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.164412 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9zc2x" podStartSLOduration=2.838739135 podStartE2EDuration="5.164395822s" podCreationTimestamp="2026-04-16 08:36:09 +0000 UTC" firstStartedPulling="2026-04-16 08:36:10.67614071 +0000 UTC m=+179.724717817" lastFinishedPulling="2026-04-16 08:36:13.001797394 +0000 UTC m=+182.050374504" observedRunningTime="2026-04-16 08:36:14.163966202 +0000 UTC m=+183.212543332" watchObservedRunningTime="2026-04-16 08:36:14.164395822 +0000 UTC m=+183.212972952"
Apr 16 08:36:14.181543 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:14.181506 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-848cf94df8-z9594" podStartSLOduration=2.079662693 podStartE2EDuration="5.181492908s" podCreationTimestamp="2026-04-16 08:36:09 +0000 UTC" firstStartedPulling="2026-04-16 08:36:09.897305396 +0000 UTC m=+178.945882516" lastFinishedPulling="2026-04-16 08:36:12.999135613 +0000 UTC m=+182.047712731" observedRunningTime="2026-04-16 08:36:14.180575073 +0000 UTC m=+183.229152203" watchObservedRunningTime="2026-04-16 08:36:14.181492908 +0000 UTC m=+183.230070037"
Apr 16 08:36:20.988224 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:20.988189 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-654675bfc8-v5st9"]
Apr 16 08:36:29.738191 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:29.738164 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:29.738191 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:29.738197 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-848cf94df8-z9594"
Apr 16 08:36:45.201139 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:45.201105 2574 generic.go:358] "Generic (PLEG): container finished" podID="3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb" containerID="8b3565b25ed6c0913dfa1eeb7d432fd10bc5000e2c7280d26b96cdca7efffb35" exitCode=0
Apr 16 08:36:45.201580 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:45.201173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" event={"ID":"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb","Type":"ContainerDied","Data":"8b3565b25ed6c0913dfa1eeb7d432fd10bc5000e2c7280d26b96cdca7efffb35"}
Apr 16 08:36:45.201580 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:45.201555 2574 scope.go:117] "RemoveContainer"
containerID="8b3565b25ed6c0913dfa1eeb7d432fd10bc5000e2c7280d26b96cdca7efffb35" Apr 16 08:36:45.202510 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:45.202491 2574 generic.go:358] "Generic (PLEG): container finished" podID="d76203c8-22cc-48c0-a9be-e10030af2601" containerID="4bb58c07935594ac13392753b52a49624b2caf838eb3d581e73ba2c3b1cf8728" exitCode=0 Apr 16 08:36:45.202568 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:45.202544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" event={"ID":"d76203c8-22cc-48c0-a9be-e10030af2601","Type":"ContainerDied","Data":"4bb58c07935594ac13392753b52a49624b2caf838eb3d581e73ba2c3b1cf8728"} Apr 16 08:36:45.202796 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:45.202786 2574 scope.go:117] "RemoveContainer" containerID="4bb58c07935594ac13392753b52a49624b2caf838eb3d581e73ba2c3b1cf8728" Apr 16 08:36:46.007253 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.007193 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-654675bfc8-v5st9" podUID="729825ae-d577-487b-9eca-273b4b9c19db" containerName="console" containerID="cri-o://6906a18cadec2745ad0b4510d6765b380e98c50ec1383cc317c03a2ffcc03357" gracePeriod=15 Apr 16 08:36:46.207682 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.207649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zngbr" event={"ID":"3d23fa67-49e4-407a-b6b8-4bd17f8a1bfb","Type":"ContainerStarted","Data":"829f79e9859b98e66ba843c41d34af50961b52a1e6de0527207c571cc6e7699e"} Apr 16 08:36:46.209479 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.209452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-f5z6f" event={"ID":"d76203c8-22cc-48c0-a9be-e10030af2601","Type":"ContainerStarted","Data":"568e559c30516df04597dc4edbab9edb7221905c95f57d7b227f7ed8aff44bb5"} Apr 16 
08:36:46.210955 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.210938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-654675bfc8-v5st9_729825ae-d577-487b-9eca-273b4b9c19db/console/0.log" Apr 16 08:36:46.211083 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.210968 2574 generic.go:358] "Generic (PLEG): container finished" podID="729825ae-d577-487b-9eca-273b4b9c19db" containerID="6906a18cadec2745ad0b4510d6765b380e98c50ec1383cc317c03a2ffcc03357" exitCode=2 Apr 16 08:36:46.211083 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.210989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654675bfc8-v5st9" event={"ID":"729825ae-d577-487b-9eca-273b4b9c19db","Type":"ContainerDied","Data":"6906a18cadec2745ad0b4510d6765b380e98c50ec1383cc317c03a2ffcc03357"} Apr 16 08:36:46.278956 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.278936 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-654675bfc8-v5st9_729825ae-d577-487b-9eca-273b4b9c19db/console/0.log" Apr 16 08:36:46.279097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.278995 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-654675bfc8-v5st9" Apr 16 08:36:46.468727 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.468695 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-oauth-config\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.468889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.468760 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-service-ca\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.468889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.468782 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-trusted-ca-bundle\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.468889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.468801 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-console-config\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.468889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.468819 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk7l8\" (UniqueName: \"kubernetes.io/projected/729825ae-d577-487b-9eca-273b4b9c19db-kube-api-access-qk7l8\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.468889 ip-10-0-128-115 
kubenswrapper[2574]: I0416 08:36:46.468838 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-serving-cert\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.468889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.468853 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-oauth-serving-cert\") pod \"729825ae-d577-487b-9eca-273b4b9c19db\" (UID: \"729825ae-d577-487b-9eca-273b4b9c19db\") " Apr 16 08:36:46.469500 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.469263 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-service-ca" (OuterVolumeSpecName: "service-ca") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:36:46.469500 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.469315 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-console-config" (OuterVolumeSpecName: "console-config") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:36:46.469500 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.469383 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:36:46.469500 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.469379 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:36:46.471330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.471306 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729825ae-d577-487b-9eca-273b4b9c19db-kube-api-access-qk7l8" (OuterVolumeSpecName: "kube-api-access-qk7l8") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "kube-api-access-qk7l8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:36:46.471330 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.471309 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:36:46.471460 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.471333 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "729825ae-d577-487b-9eca-273b4b9c19db" (UID: "729825ae-d577-487b-9eca-273b4b9c19db"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:36:46.570205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570140 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-service-ca\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:46.570205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570168 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-trusted-ca-bundle\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:46.570205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570179 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-console-config\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:46.570205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570188 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qk7l8\" (UniqueName: \"kubernetes.io/projected/729825ae-d577-487b-9eca-273b4b9c19db-kube-api-access-qk7l8\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:46.570205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570198 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-serving-cert\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:46.570205 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570207 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/729825ae-d577-487b-9eca-273b4b9c19db-oauth-serving-cert\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:46.570474 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:46.570218 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/729825ae-d577-487b-9eca-273b4b9c19db-console-oauth-config\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:36:47.218364 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.218329 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-654675bfc8-v5st9_729825ae-d577-487b-9eca-273b4b9c19db/console/0.log" Apr 16 08:36:47.218819 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.218410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-654675bfc8-v5st9" event={"ID":"729825ae-d577-487b-9eca-273b4b9c19db","Type":"ContainerDied","Data":"3c41ed9d3bbc497aca95021bbe0e8d3dcfafc58782e2db500b4a4e3327a0e6b8"} Apr 16 08:36:47.218819 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.218448 2574 scope.go:117] "RemoveContainer" containerID="6906a18cadec2745ad0b4510d6765b380e98c50ec1383cc317c03a2ffcc03357" Apr 16 08:36:47.218819 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.218475 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-654675bfc8-v5st9" Apr 16 08:36:47.241491 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.241461 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-654675bfc8-v5st9"] Apr 16 08:36:47.246047 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.246010 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-654675bfc8-v5st9"] Apr 16 08:36:47.539412 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:47.539339 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="729825ae-d577-487b-9eca-273b4b9c19db" path="/var/lib/kubelet/pods/729825ae-d577-487b-9eca-273b4b9c19db/volumes" Apr 16 08:36:49.743887 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:49.743810 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-848cf94df8-z9594" Apr 16 08:36:49.747852 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:49.747825 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-848cf94df8-z9594" Apr 16 08:36:50.229087 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:50.229057 2574 generic.go:358] "Generic (PLEG): container finished" podID="feaae171-21c4-4aed-973a-8bfcf22b6913" containerID="30e30e045d63d1a7648a917e29458433b96965883980d4b644a4153f8c7ff0da" exitCode=0 Apr 16 08:36:50.229368 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:50.229138 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" event={"ID":"feaae171-21c4-4aed-973a-8bfcf22b6913","Type":"ContainerDied","Data":"30e30e045d63d1a7648a917e29458433b96965883980d4b644a4153f8c7ff0da"} Apr 16 08:36:50.229645 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:50.229624 2574 scope.go:117] "RemoveContainer" 
containerID="30e30e045d63d1a7648a917e29458433b96965883980d4b644a4153f8c7ff0da" Apr 16 08:36:51.233145 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:36:51.233111 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-psv25" event={"ID":"feaae171-21c4-4aed-973a-8bfcf22b6913","Type":"ContainerStarted","Data":"8bca3c69b6b5073903f204c35b06ed55d1578c8f1e5c1b0a5a739c78c04c7f62"} Apr 16 08:37:23.350606 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:23.350575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:37:23.352787 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:23.352769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1290b06-222c-45ae-985a-c88370488114-metrics-certs\") pod \"network-metrics-daemon-r9fn4\" (UID: \"b1290b06-222c-45ae-985a-c88370488114\") " pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:37:23.538004 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:23.537978 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kd4nk\"" Apr 16 08:37:23.545905 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:23.545879 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9fn4" Apr 16 08:37:23.666094 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:23.666049 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r9fn4"] Apr 16 08:37:23.673782 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:37:23.673756 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1290b06_222c_45ae_985a_c88370488114.slice/crio-aa50a3866c8ab21c5c7149cec125ba7c274fe6a926cbf0b449d3c3ddc8cdb055 WatchSource:0}: Error finding container aa50a3866c8ab21c5c7149cec125ba7c274fe6a926cbf0b449d3c3ddc8cdb055: Status 404 returned error can't find the container with id aa50a3866c8ab21c5c7149cec125ba7c274fe6a926cbf0b449d3c3ddc8cdb055 Apr 16 08:37:24.327626 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:24.327129 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r9fn4" event={"ID":"b1290b06-222c-45ae-985a-c88370488114","Type":"ContainerStarted","Data":"aa50a3866c8ab21c5c7149cec125ba7c274fe6a926cbf0b449d3c3ddc8cdb055"} Apr 16 08:37:25.331108 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:25.331072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r9fn4" event={"ID":"b1290b06-222c-45ae-985a-c88370488114","Type":"ContainerStarted","Data":"804ee6f399695815c2a0bb45e47af459e608515e6c5e091f0dd9c62579d3e8aa"} Apr 16 08:37:25.331108 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:25.331108 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r9fn4" event={"ID":"b1290b06-222c-45ae-985a-c88370488114","Type":"ContainerStarted","Data":"ad92652f757222ed57c0092c13f32d50cd0b745c912d3287761a2bc11c921b3d"} Apr 16 08:37:25.349555 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:25.349512 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-r9fn4" podStartSLOduration=253.275531517 podStartE2EDuration="4m14.349498229s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:37:23.675350708 +0000 UTC m=+252.723927815" lastFinishedPulling="2026-04-16 08:37:24.749317416 +0000 UTC m=+253.797894527" observedRunningTime="2026-04-16 08:37:25.34801558 +0000 UTC m=+254.396592709" watchObservedRunningTime="2026-04-16 08:37:25.349498229 +0000 UTC m=+254.398075358" Apr 16 08:37:29.313870 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.313782 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5b684bb9db-s7xtw"] Apr 16 08:37:29.314319 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.314198 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="729825ae-d577-487b-9eca-273b4b9c19db" containerName="console" Apr 16 08:37:29.314319 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.314213 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="729825ae-d577-487b-9eca-273b4b9c19db" containerName="console" Apr 16 08:37:29.314319 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.314301 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="729825ae-d577-487b-9eca-273b4b9c19db" containerName="console" Apr 16 08:37:29.317609 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.317590 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.320246 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.320193 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 08:37:29.320246 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.320223 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 08:37:29.320409 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.320273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 08:37:29.320409 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.320223 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 08:37:29.320409 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.320282 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 08:37:29.320409 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.320337 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-hmmr2\"" Apr 16 08:37:29.330876 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.330852 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 08:37:29.331949 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.331390 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b684bb9db-s7xtw"] Apr 16 08:37:29.394568 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-metrics-client-ca\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394737 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394737 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394604 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394737 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-secret-telemeter-client\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394737 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-federate-client-tls\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394892 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394753 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-telemeter-client-tls\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394892 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n947f\" (UniqueName: \"kubernetes.io/projected/5e19740e-e897-4983-831e-38a5090e217f-kube-api-access-n947f\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.394892 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.394793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-serving-certs-ca-bundle\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.495926 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.495891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-metrics-client-ca\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " 
pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.495948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.495980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496133 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-secret-telemeter-client\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496309 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-federate-client-tls\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496309 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496247 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-telemeter-client-tls\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496309 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n947f\" (UniqueName: \"kubernetes.io/projected/5e19740e-e897-4983-831e-38a5090e217f-kube-api-access-n947f\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496309 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-serving-certs-ca-bundle\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496829 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496700 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-metrics-client-ca\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.496829 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: 
\"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.497008 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.496968 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e19740e-e897-4983-831e-38a5090e217f-serving-certs-ca-bundle\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.498600 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.498559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-secret-telemeter-client\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.498600 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.498585 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.499014 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.498995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-federate-client-tls\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.499014 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.499007 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5e19740e-e897-4983-831e-38a5090e217f-telemeter-client-tls\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.506544 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.506524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n947f\" (UniqueName: \"kubernetes.io/projected/5e19740e-e897-4983-831e-38a5090e217f-kube-api-access-n947f\") pod \"telemeter-client-5b684bb9db-s7xtw\" (UID: \"5e19740e-e897-4983-831e-38a5090e217f\") " pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.634959 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.634885 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" Apr 16 08:37:29.757402 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:29.757380 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b684bb9db-s7xtw"] Apr 16 08:37:29.759258 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:37:29.759231 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e19740e_e897_4983_831e_38a5090e217f.slice/crio-47e67e91fb58cbabce2899043b008bdd54830c0a56e269081cae15d457136350 WatchSource:0}: Error finding container 47e67e91fb58cbabce2899043b008bdd54830c0a56e269081cae15d457136350: Status 404 returned error can't find the container with id 47e67e91fb58cbabce2899043b008bdd54830c0a56e269081cae15d457136350 Apr 16 08:37:30.345829 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:30.345782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" 
event={"ID":"5e19740e-e897-4983-831e-38a5090e217f","Type":"ContainerStarted","Data":"47e67e91fb58cbabce2899043b008bdd54830c0a56e269081cae15d457136350"} Apr 16 08:37:31.351374 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:31.351308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" event={"ID":"5e19740e-e897-4983-831e-38a5090e217f","Type":"ContainerStarted","Data":"5193abdd5aa5a52ddb42636f17dbe6300714af31cdd8400001ca62af3851b977"} Apr 16 08:37:31.351374 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:31.351351 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" event={"ID":"5e19740e-e897-4983-831e-38a5090e217f","Type":"ContainerStarted","Data":"1853785bb30c6244b905ed1f92d049649265d719dc0577ef3a5e7f52e5d55584"} Apr 16 08:37:32.355836 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:32.355803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" event={"ID":"5e19740e-e897-4983-831e-38a5090e217f","Type":"ContainerStarted","Data":"c904588626c52d23bfe7ceddeaabc45b36666302ede50ea7808300bc3c20d558"} Apr 16 08:37:32.379702 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:32.379661 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5b684bb9db-s7xtw" podStartSLOduration=1.9455179390000001 podStartE2EDuration="3.379647752s" podCreationTimestamp="2026-04-16 08:37:29 +0000 UTC" firstStartedPulling="2026-04-16 08:37:29.761137706 +0000 UTC m=+258.809714812" lastFinishedPulling="2026-04-16 08:37:31.195267504 +0000 UTC m=+260.243844625" observedRunningTime="2026-04-16 08:37:32.378485492 +0000 UTC m=+261.427062620" watchObservedRunningTime="2026-04-16 08:37:32.379647752 +0000 UTC m=+261.428224889" Apr 16 08:37:33.588640 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.588611 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-7b96f66ccc-l2nqp"] Apr 16 08:37:33.591908 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.591885 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.595153 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.595124 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 08:37:33.596078 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.596050 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 08:37:33.596078 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.596067 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 08:37:33.596223 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.596138 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 08:37:33.596223 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.596066 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jcp87\"" Apr 16 08:37:33.596380 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.596366 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 08:37:33.601506 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.601485 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 08:37:33.603490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.603471 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b96f66ccc-l2nqp"] Apr 16 08:37:33.735067 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735011 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-oauth-config\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.735227 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-console-config\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.735227 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-948j8\" (UniqueName: \"kubernetes.io/projected/5e184b38-8909-4624-a49e-40ec034331e9-kube-api-access-948j8\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.735227 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-trusted-ca-bundle\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.735227 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735198 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-service-ca\") pod \"console-7b96f66ccc-l2nqp\" (UID: 
\"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.735365 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-oauth-serving-cert\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.735365 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.735256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-serving-cert\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.835966 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.835935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-948j8\" (UniqueName: \"kubernetes.io/projected/5e184b38-8909-4624-a49e-40ec034331e9-kube-api-access-948j8\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836136 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.835974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-trusted-ca-bundle\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836136 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.835995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-service-ca\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836136 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-oauth-serving-cert\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836136 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-serving-cert\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836136 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-oauth-config\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836136 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-console-config\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836735 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836704 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-service-ca\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836851 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-console-config\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.836851 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.836815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-oauth-serving-cert\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.837202 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.837179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-trusted-ca-bundle\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.838508 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.838486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-serving-cert\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.838669 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.838627 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-oauth-config\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.845155 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.845131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-948j8\" (UniqueName: \"kubernetes.io/projected/5e184b38-8909-4624-a49e-40ec034331e9-kube-api-access-948j8\") pod \"console-7b96f66ccc-l2nqp\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:33.904211 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:33.904189 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:34.025250 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:34.025225 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b96f66ccc-l2nqp"] Apr 16 08:37:34.027669 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:37:34.027641 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e184b38_8909_4624_a49e_40ec034331e9.slice/crio-17aaec6458b2f9bd495fb35a47e4c14c934faab8a9449543fabe7de0219573dc WatchSource:0}: Error finding container 17aaec6458b2f9bd495fb35a47e4c14c934faab8a9449543fabe7de0219573dc: Status 404 returned error can't find the container with id 17aaec6458b2f9bd495fb35a47e4c14c934faab8a9449543fabe7de0219573dc Apr 16 08:37:34.362792 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:34.362757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b96f66ccc-l2nqp" 
event={"ID":"5e184b38-8909-4624-a49e-40ec034331e9","Type":"ContainerStarted","Data":"93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a"} Apr 16 08:37:34.362792 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:34.362792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b96f66ccc-l2nqp" event={"ID":"5e184b38-8909-4624-a49e-40ec034331e9","Type":"ContainerStarted","Data":"17aaec6458b2f9bd495fb35a47e4c14c934faab8a9449543fabe7de0219573dc"} Apr 16 08:37:34.382256 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:34.382210 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b96f66ccc-l2nqp" podStartSLOduration=1.382195086 podStartE2EDuration="1.382195086s" podCreationTimestamp="2026-04-16 08:37:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:37:34.38117966 +0000 UTC m=+263.429756803" watchObservedRunningTime="2026-04-16 08:37:34.382195086 +0000 UTC m=+263.430772216" Apr 16 08:37:41.191425 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.191390 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b96f66ccc-l2nqp"] Apr 16 08:37:41.224590 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.224565 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cffff7fdf-nhfpr"] Apr 16 08:37:41.230387 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.230367 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.238976 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.238953 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cffff7fdf-nhfpr"] Apr 16 08:37:41.391947 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.391919 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-config\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.392089 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.391954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-service-ca\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.392089 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.391972 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-oauth-serving-cert\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.392089 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.391988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cllnl\" (UniqueName: \"kubernetes.io/projected/c49613c2-4be5-4228-968e-23cb0ef3c4a0-kube-api-access-cllnl\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 
08:37:41.392207 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.392083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-oauth-config\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.392207 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.392114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-serving-cert\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.392207 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.392184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-trusted-ca-bundle\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493279 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-trusted-ca-bundle\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493279 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-config\") pod 
\"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493412 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-service-ca\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493412 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-oauth-serving-cert\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493412 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cllnl\" (UniqueName: \"kubernetes.io/projected/c49613c2-4be5-4228-968e-23cb0ef3c4a0-kube-api-access-cllnl\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493412 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-oauth-config\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.493581 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.493426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-serving-cert\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.498042 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.494253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-service-ca\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.498042 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.494456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-oauth-serving-cert\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.498042 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.494457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-config\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.498042 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.494488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-trusted-ca-bundle\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.498042 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.496872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-serving-cert\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.499978 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.499952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-oauth-config\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.503657 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.503635 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cllnl\" (UniqueName: \"kubernetes.io/projected/c49613c2-4be5-4228-968e-23cb0ef3c4a0-kube-api-access-cllnl\") pod \"console-7cffff7fdf-nhfpr\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") " pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.540167 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.540143 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:41.677519 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:41.677495 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cffff7fdf-nhfpr"] Apr 16 08:37:41.679745 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:37:41.679705 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49613c2_4be5_4228_968e_23cb0ef3c4a0.slice/crio-4157d8b788857e272ed0b6b34ea40e44833e734f2f550a190220f3df526a6b3d WatchSource:0}: Error finding container 4157d8b788857e272ed0b6b34ea40e44833e734f2f550a190220f3df526a6b3d: Status 404 returned error can't find the container with id 4157d8b788857e272ed0b6b34ea40e44833e734f2f550a190220f3df526a6b3d Apr 16 08:37:42.385387 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:42.385353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cffff7fdf-nhfpr" event={"ID":"c49613c2-4be5-4228-968e-23cb0ef3c4a0","Type":"ContainerStarted","Data":"3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e"} Apr 16 08:37:42.385387 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:42.385388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cffff7fdf-nhfpr" event={"ID":"c49613c2-4be5-4228-968e-23cb0ef3c4a0","Type":"ContainerStarted","Data":"4157d8b788857e272ed0b6b34ea40e44833e734f2f550a190220f3df526a6b3d"} Apr 16 08:37:42.405211 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:42.405166 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cffff7fdf-nhfpr" podStartSLOduration=1.405153404 podStartE2EDuration="1.405153404s" podCreationTimestamp="2026-04-16 08:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:37:42.403611249 +0000 UTC 
m=+271.452188378" watchObservedRunningTime="2026-04-16 08:37:42.405153404 +0000 UTC m=+271.453730534" Apr 16 08:37:43.904683 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:43.904652 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:37:51.541224 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:51.541193 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:51.541224 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:51.541228 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:51.545821 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:51.545799 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:37:52.417603 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:37:52.417574 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:38:06.209969 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.209903 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b96f66ccc-l2nqp" podUID="5e184b38-8909-4624-a49e-40ec034331e9" containerName="console" containerID="cri-o://93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a" gracePeriod=15 Apr 16 08:38:06.438505 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.438485 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b96f66ccc-l2nqp_5e184b38-8909-4624-a49e-40ec034331e9/console/0.log" Apr 16 08:38:06.438612 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.438546 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:38:06.462938 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.462887 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b96f66ccc-l2nqp_5e184b38-8909-4624-a49e-40ec034331e9/console/0.log" Apr 16 08:38:06.462938 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.462922 2574 generic.go:358] "Generic (PLEG): container finished" podID="5e184b38-8909-4624-a49e-40ec034331e9" containerID="93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a" exitCode=2 Apr 16 08:38:06.463124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.462947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b96f66ccc-l2nqp" event={"ID":"5e184b38-8909-4624-a49e-40ec034331e9","Type":"ContainerDied","Data":"93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a"} Apr 16 08:38:06.463124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.462969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b96f66ccc-l2nqp" event={"ID":"5e184b38-8909-4624-a49e-40ec034331e9","Type":"ContainerDied","Data":"17aaec6458b2f9bd495fb35a47e4c14c934faab8a9449543fabe7de0219573dc"} Apr 16 08:38:06.463124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.462984 2574 scope.go:117] "RemoveContainer" containerID="93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a" Apr 16 08:38:06.463124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.463007 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b96f66ccc-l2nqp" Apr 16 08:38:06.471751 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.471731 2574 scope.go:117] "RemoveContainer" containerID="93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a" Apr 16 08:38:06.471986 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:38:06.471967 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a\": container with ID starting with 93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a not found: ID does not exist" containerID="93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a" Apr 16 08:38:06.472080 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.472004 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a"} err="failed to get container status \"93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a\": rpc error: code = NotFound desc = could not find container \"93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a\": container with ID starting with 93819129d07f19d104ab3b3e864d045fff8e1bd3f3fb4643c985ba5e3b56839a not found: ID does not exist" Apr 16 08:38:06.490656 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.490635 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-oauth-config\") pod \"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.490750 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.490669 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-console-config\") pod \"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.490750 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.490704 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-service-ca\") pod \"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.490891 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.490874 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-serving-cert\") pod \"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.490944 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.490921 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-trusted-ca-bundle\") pod \"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.491000 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.490957 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-oauth-serving-cert\") pod \"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.491097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491002 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-948j8\" (UniqueName: \"kubernetes.io/projected/5e184b38-8909-4624-a49e-40ec034331e9-kube-api-access-948j8\") pod 
\"5e184b38-8909-4624-a49e-40ec034331e9\" (UID: \"5e184b38-8909-4624-a49e-40ec034331e9\") " Apr 16 08:38:06.491097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491043 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-console-config" (OuterVolumeSpecName: "console-config") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:06.491097 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491052 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-service-ca" (OuterVolumeSpecName: "service-ca") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:06.491267 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491249 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-console-config\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.491332 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491271 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-service-ca\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.491458 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491437 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:06.491514 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.491493 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:06.492758 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.492737 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:38:06.493085 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.493016 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e184b38-8909-4624-a49e-40ec034331e9-kube-api-access-948j8" (OuterVolumeSpecName: "kube-api-access-948j8") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). InnerVolumeSpecName "kube-api-access-948j8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:38:06.493169 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.493154 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5e184b38-8909-4624-a49e-40ec034331e9" (UID: "5e184b38-8909-4624-a49e-40ec034331e9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:38:06.592576 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.592553 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-serving-cert\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.592576 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.592573 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-trusted-ca-bundle\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.592695 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.592582 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e184b38-8909-4624-a49e-40ec034331e9-oauth-serving-cert\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.592695 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.592591 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-948j8\" (UniqueName: \"kubernetes.io/projected/5e184b38-8909-4624-a49e-40ec034331e9-kube-api-access-948j8\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.592695 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.592601 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e184b38-8909-4624-a49e-40ec034331e9-console-oauth-config\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:38:06.785436 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.785403 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b96f66ccc-l2nqp"] Apr 16 08:38:06.792303 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:06.792283 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-7b96f66ccc-l2nqp"] Apr 16 08:38:07.538642 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:07.538613 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e184b38-8909-4624-a49e-40ec034331e9" path="/var/lib/kubelet/pods/5e184b38-8909-4624-a49e-40ec034331e9/volumes" Apr 16 08:38:11.400731 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:11.400704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:38:11.402309 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:11.402287 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:38:11.409059 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:11.409037 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 08:38:21.730396 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.730361 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-95szv"] Apr 16 08:38:21.733072 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.730660 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e184b38-8909-4624-a49e-40ec034331e9" containerName="console" Apr 16 08:38:21.733072 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.730675 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e184b38-8909-4624-a49e-40ec034331e9" containerName="console" Apr 16 08:38:21.733072 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.730725 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e184b38-8909-4624-a49e-40ec034331e9" containerName="console" Apr 16 08:38:21.733956 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.733941 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.736111 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.736091 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 08:38:21.741084 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.740688 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-95szv"] Apr 16 08:38:21.802691 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.802661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a800b20-6dc5-4861-9dc2-f65c151011c7-original-pull-secret\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.802804 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.802753 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a800b20-6dc5-4861-9dc2-f65c151011c7-kubelet-config\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.802804 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.802782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a800b20-6dc5-4861-9dc2-f65c151011c7-dbus\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.903195 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.903156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/7a800b20-6dc5-4861-9dc2-f65c151011c7-original-pull-secret\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.903326 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.903249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a800b20-6dc5-4861-9dc2-f65c151011c7-kubelet-config\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.903326 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.903268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a800b20-6dc5-4861-9dc2-f65c151011c7-dbus\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.903395 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.903362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a800b20-6dc5-4861-9dc2-f65c151011c7-kubelet-config\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.903444 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.903431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a800b20-6dc5-4861-9dc2-f65c151011c7-dbus\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:21.905415 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:21.905390 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a800b20-6dc5-4861-9dc2-f65c151011c7-original-pull-secret\") pod \"global-pull-secret-syncer-95szv\" (UID: \"7a800b20-6dc5-4861-9dc2-f65c151011c7\") " pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:22.043478 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:22.043414 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-95szv" Apr 16 08:38:22.162199 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:22.162175 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-95szv"] Apr 16 08:38:22.164583 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:38:22.164560 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a800b20_6dc5_4861_9dc2_f65c151011c7.slice/crio-399d73ac040cd9bc9efa5affb3ff1d9ad8452db82382b175b4fe8b71565bb47c WatchSource:0}: Error finding container 399d73ac040cd9bc9efa5affb3ff1d9ad8452db82382b175b4fe8b71565bb47c: Status 404 returned error can't find the container with id 399d73ac040cd9bc9efa5affb3ff1d9ad8452db82382b175b4fe8b71565bb47c Apr 16 08:38:22.166271 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:22.166255 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:38:22.509223 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:22.509184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-95szv" event={"ID":"7a800b20-6dc5-4861-9dc2-f65c151011c7","Type":"ContainerStarted","Data":"399d73ac040cd9bc9efa5affb3ff1d9ad8452db82382b175b4fe8b71565bb47c"} Apr 16 08:38:26.525949 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:26.525914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-95szv" 
event={"ID":"7a800b20-6dc5-4861-9dc2-f65c151011c7","Type":"ContainerStarted","Data":"040548aef24d0e02de3e16f9878a74440f3512f16d55261fff24e22c35b8de8b"} Apr 16 08:38:26.542877 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:26.542834 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-95szv" podStartSLOduration=1.905746927 podStartE2EDuration="5.542820783s" podCreationTimestamp="2026-04-16 08:38:21 +0000 UTC" firstStartedPulling="2026-04-16 08:38:22.166383771 +0000 UTC m=+311.214960879" lastFinishedPulling="2026-04-16 08:38:25.803457628 +0000 UTC m=+314.852034735" observedRunningTime="2026-04-16 08:38:26.541924903 +0000 UTC m=+315.590502031" watchObservedRunningTime="2026-04-16 08:38:26.542820783 +0000 UTC m=+315.591397911" Apr 16 08:38:46.599239 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.599200 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm"] Apr 16 08:38:46.601639 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.601618 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.604618 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.604599 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 08:38:46.604714 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.604632 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 08:38:46.604714 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.604663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 08:38:46.604714 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.604633 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pbpnl\"" Apr 16 08:38:46.604878 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.604633 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 08:38:46.610822 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.610803 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm"] Apr 16 08:38:46.696919 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.696893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmksz\" (UniqueName: \"kubernetes.io/projected/1d75edc8-7c06-4441-ad34-efae3b97eb93-kube-api-access-qmksz\") pod \"managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm\" (UID: \"1d75edc8-7c06-4441-ad34-efae3b97eb93\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.697075 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.697041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d75edc8-7c06-4441-ad34-efae3b97eb93-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm\" (UID: \"1d75edc8-7c06-4441-ad34-efae3b97eb93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.798247 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.798213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d75edc8-7c06-4441-ad34-efae3b97eb93-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm\" (UID: \"1d75edc8-7c06-4441-ad34-efae3b97eb93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.798401 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.798265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmksz\" (UniqueName: \"kubernetes.io/projected/1d75edc8-7c06-4441-ad34-efae3b97eb93-kube-api-access-qmksz\") pod \"managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm\" (UID: \"1d75edc8-7c06-4441-ad34-efae3b97eb93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.800717 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.800697 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d75edc8-7c06-4441-ad34-efae3b97eb93-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm\" (UID: \"1d75edc8-7c06-4441-ad34-efae3b97eb93\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.807692 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.807667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmksz\" (UniqueName: \"kubernetes.io/projected/1d75edc8-7c06-4441-ad34-efae3b97eb93-kube-api-access-qmksz\") pod \"managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm\" (UID: \"1d75edc8-7c06-4441-ad34-efae3b97eb93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:46.923929 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:46.923905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" Apr 16 08:38:47.041120 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:47.041088 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm"] Apr 16 08:38:47.044218 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:38:47.044186 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d75edc8_7c06_4441_ad34_efae3b97eb93.slice/crio-ed9b4b1d619d978261ff1a68b326b2bc1626103a9d15dcf3ff0222a2fc242d2f WatchSource:0}: Error finding container ed9b4b1d619d978261ff1a68b326b2bc1626103a9d15dcf3ff0222a2fc242d2f: Status 404 returned error can't find the container with id ed9b4b1d619d978261ff1a68b326b2bc1626103a9d15dcf3ff0222a2fc242d2f Apr 16 08:38:47.589605 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:47.589567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" 
event={"ID":"1d75edc8-7c06-4441-ad34-efae3b97eb93","Type":"ContainerStarted","Data":"ed9b4b1d619d978261ff1a68b326b2bc1626103a9d15dcf3ff0222a2fc242d2f"} Apr 16 08:38:49.597121 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:49.597044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" event={"ID":"1d75edc8-7c06-4441-ad34-efae3b97eb93","Type":"ContainerStarted","Data":"930b52784c3cc26c3a82f2e0ed140c97ecebe7254ea8f858ca75f54024298d50"} Apr 16 08:38:49.612768 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:38:49.612724 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7c84cf54dc-d8zbm" podStartSLOduration=1.339158966 podStartE2EDuration="3.612710982s" podCreationTimestamp="2026-04-16 08:38:46 +0000 UTC" firstStartedPulling="2026-04-16 08:38:47.046571575 +0000 UTC m=+336.095148682" lastFinishedPulling="2026-04-16 08:38:49.320123588 +0000 UTC m=+338.368700698" observedRunningTime="2026-04-16 08:38:49.612063903 +0000 UTC m=+338.660641032" watchObservedRunningTime="2026-04-16 08:38:49.612710982 +0000 UTC m=+338.661288129" Apr 16 08:39:45.388545 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.388513 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-krfk5"] Apr 16 08:39:45.391720 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.391703 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5" Apr 16 08:39:45.394070 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.394045 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 08:39:45.394736 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.394677 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-wzhhp\"" Apr 16 08:39:45.394824 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.394710 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 08:39:45.396810 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.396789 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-krfk5"] Apr 16 08:39:45.544709 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.544684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3bac97-f36b-418d-8856-6dc765b99419-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-krfk5\" (UID: \"2e3bac97-f36b-418d-8856-6dc765b99419\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5" Apr 16 08:39:45.544865 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.544740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grr4w\" (UniqueName: \"kubernetes.io/projected/2e3bac97-f36b-418d-8856-6dc765b99419-kube-api-access-grr4w\") pod \"cert-manager-cainjector-8966b78d4-krfk5\" (UID: \"2e3bac97-f36b-418d-8856-6dc765b99419\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5" Apr 16 08:39:45.645921 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.645847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3bac97-f36b-418d-8856-6dc765b99419-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-krfk5\" (UID: \"2e3bac97-f36b-418d-8856-6dc765b99419\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5"
Apr 16 08:39:45.646078 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.645923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grr4w\" (UniqueName: \"kubernetes.io/projected/2e3bac97-f36b-418d-8856-6dc765b99419-kube-api-access-grr4w\") pod \"cert-manager-cainjector-8966b78d4-krfk5\" (UID: \"2e3bac97-f36b-418d-8856-6dc765b99419\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5"
Apr 16 08:39:45.655641 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.655618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3bac97-f36b-418d-8856-6dc765b99419-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-krfk5\" (UID: \"2e3bac97-f36b-418d-8856-6dc765b99419\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5"
Apr 16 08:39:45.655875 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.655852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grr4w\" (UniqueName: \"kubernetes.io/projected/2e3bac97-f36b-418d-8856-6dc765b99419-kube-api-access-grr4w\") pod \"cert-manager-cainjector-8966b78d4-krfk5\" (UID: \"2e3bac97-f36b-418d-8856-6dc765b99419\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5"
Apr 16 08:39:45.702035 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.702003 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5"
Apr 16 08:39:45.821431 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:45.821401 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-krfk5"]
Apr 16 08:39:45.824909 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:39:45.824879 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e3bac97_f36b_418d_8856_6dc765b99419.slice/crio-7e97d6da40369dc43562b77cf6f8fb432b0cbbf1dc6370baf7798c9d97b755be WatchSource:0}: Error finding container 7e97d6da40369dc43562b77cf6f8fb432b0cbbf1dc6370baf7798c9d97b755be: Status 404 returned error can't find the container with id 7e97d6da40369dc43562b77cf6f8fb432b0cbbf1dc6370baf7798c9d97b755be
Apr 16 08:39:46.762083 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:46.762036 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5" event={"ID":"2e3bac97-f36b-418d-8856-6dc765b99419","Type":"ContainerStarted","Data":"7e97d6da40369dc43562b77cf6f8fb432b0cbbf1dc6370baf7798c9d97b755be"}
Apr 16 08:39:49.772623 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:49.772589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5" event={"ID":"2e3bac97-f36b-418d-8856-6dc765b99419","Type":"ContainerStarted","Data":"446d30f27afc2a4dd3466411f040dcdc37c3d294197869aa5be17713da6f3b12"}
Apr 16 08:39:49.791124 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:39:49.791077 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-krfk5" podStartSLOduration=1.549469437 podStartE2EDuration="4.791064088s" podCreationTimestamp="2026-04-16 08:39:45 +0000 UTC" firstStartedPulling="2026-04-16 08:39:45.826913229 +0000 UTC m=+394.875490336" lastFinishedPulling="2026-04-16 08:39:49.068507876 +0000 UTC m=+398.117084987" observedRunningTime="2026-04-16 08:39:49.789701515 +0000 UTC m=+398.838278656" watchObservedRunningTime="2026-04-16 08:39:49.791064088 +0000 UTC m=+398.839641217"
Apr 16 08:40:02.825573 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:02.825537 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-qpdgv"]
Apr 16 08:40:02.831933 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:02.831911 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:02.834373 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:02.834352 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-56gwf\""
Apr 16 08:40:02.839857 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:02.839838 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-qpdgv"]
Apr 16 08:40:02.978533 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:02.978499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxqq\" (UniqueName: \"kubernetes.io/projected/e6f2f233-ba88-410b-ae6b-45e81abc22f9-kube-api-access-2lxqq\") pod \"cert-manager-759f64656b-qpdgv\" (UID: \"e6f2f233-ba88-410b-ae6b-45e81abc22f9\") " pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:02.978655 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:02.978546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6f2f233-ba88-410b-ae6b-45e81abc22f9-bound-sa-token\") pod \"cert-manager-759f64656b-qpdgv\" (UID: \"e6f2f233-ba88-410b-ae6b-45e81abc22f9\") " pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:03.079972 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.079899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxqq\" (UniqueName: \"kubernetes.io/projected/e6f2f233-ba88-410b-ae6b-45e81abc22f9-kube-api-access-2lxqq\") pod \"cert-manager-759f64656b-qpdgv\" (UID: \"e6f2f233-ba88-410b-ae6b-45e81abc22f9\") " pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:03.079972 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.079955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6f2f233-ba88-410b-ae6b-45e81abc22f9-bound-sa-token\") pod \"cert-manager-759f64656b-qpdgv\" (UID: \"e6f2f233-ba88-410b-ae6b-45e81abc22f9\") " pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:03.092668 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.092640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6f2f233-ba88-410b-ae6b-45e81abc22f9-bound-sa-token\") pod \"cert-manager-759f64656b-qpdgv\" (UID: \"e6f2f233-ba88-410b-ae6b-45e81abc22f9\") " pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:03.092821 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.092801 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxqq\" (UniqueName: \"kubernetes.io/projected/e6f2f233-ba88-410b-ae6b-45e81abc22f9-kube-api-access-2lxqq\") pod \"cert-manager-759f64656b-qpdgv\" (UID: \"e6f2f233-ba88-410b-ae6b-45e81abc22f9\") " pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:03.141584 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.141563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-qpdgv"
Apr 16 08:40:03.259235 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.259194 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-qpdgv"]
Apr 16 08:40:03.261262 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:40:03.261233 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6f2f233_ba88_410b_ae6b_45e81abc22f9.slice/crio-1b126219341c112fb777778469716581bdd0eeeeca6074b1cf506fee6d323cea WatchSource:0}: Error finding container 1b126219341c112fb777778469716581bdd0eeeeca6074b1cf506fee6d323cea: Status 404 returned error can't find the container with id 1b126219341c112fb777778469716581bdd0eeeeca6074b1cf506fee6d323cea
Apr 16 08:40:03.814379 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.814340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-qpdgv" event={"ID":"e6f2f233-ba88-410b-ae6b-45e81abc22f9","Type":"ContainerStarted","Data":"5c2c259753fa0566a93fec50c7c8a32cbba44fdd8bf8e70077b3e0bd88c1848e"}
Apr 16 08:40:03.814379 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.814381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-qpdgv" event={"ID":"e6f2f233-ba88-410b-ae6b-45e81abc22f9","Type":"ContainerStarted","Data":"1b126219341c112fb777778469716581bdd0eeeeca6074b1cf506fee6d323cea"}
Apr 16 08:40:03.832999 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:03.832955 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-qpdgv" podStartSLOduration=1.8329417110000001 podStartE2EDuration="1.832941711s" podCreationTimestamp="2026-04-16 08:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:40:03.831734648 +0000 UTC m=+412.880311778" watchObservedRunningTime="2026-04-16 08:40:03.832941711 +0000 UTC m=+412.881518841"
Apr 16 08:40:18.732571 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.732536 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"]
Apr 16 08:40:18.737958 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.737939 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:18.740334 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.740306 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 16 08:40:18.740465 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.740397 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-vp6r9\""
Apr 16 08:40:18.740538 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.740472 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:40:18.744659 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.744634 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"]
Apr 16 08:40:18.789046 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.788993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8tn\" (UniqueName: \"kubernetes.io/projected/054afdce-9736-4d1a-a2f9-6dd5993fca01-kube-api-access-8h8tn\") pod \"jobset-operator-747c5859c7-x5m82\" (UID: \"054afdce-9736-4d1a-a2f9-6dd5993fca01\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:18.789193 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.789125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/054afdce-9736-4d1a-a2f9-6dd5993fca01-tmp\") pod \"jobset-operator-747c5859c7-x5m82\" (UID: \"054afdce-9736-4d1a-a2f9-6dd5993fca01\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:18.889706 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.889677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/054afdce-9736-4d1a-a2f9-6dd5993fca01-tmp\") pod \"jobset-operator-747c5859c7-x5m82\" (UID: \"054afdce-9736-4d1a-a2f9-6dd5993fca01\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:18.889857 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.889730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8tn\" (UniqueName: \"kubernetes.io/projected/054afdce-9736-4d1a-a2f9-6dd5993fca01-kube-api-access-8h8tn\") pod \"jobset-operator-747c5859c7-x5m82\" (UID: \"054afdce-9736-4d1a-a2f9-6dd5993fca01\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:18.890053 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.890015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/054afdce-9736-4d1a-a2f9-6dd5993fca01-tmp\") pod \"jobset-operator-747c5859c7-x5m82\" (UID: \"054afdce-9736-4d1a-a2f9-6dd5993fca01\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:18.898694 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:18.898666 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8tn\" (UniqueName: \"kubernetes.io/projected/054afdce-9736-4d1a-a2f9-6dd5993fca01-kube-api-access-8h8tn\") pod \"jobset-operator-747c5859c7-x5m82\" (UID: \"054afdce-9736-4d1a-a2f9-6dd5993fca01\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:19.048442 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:19.048371 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"
Apr 16 08:40:19.167002 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:19.166981 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-x5m82"]
Apr 16 08:40:19.169346 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:40:19.169319 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod054afdce_9736_4d1a_a2f9_6dd5993fca01.slice/crio-5ecfd5f48185e9fbc7505924677ca78cf6647653935c61a830c6ac2be97aa9a5 WatchSource:0}: Error finding container 5ecfd5f48185e9fbc7505924677ca78cf6647653935c61a830c6ac2be97aa9a5: Status 404 returned error can't find the container with id 5ecfd5f48185e9fbc7505924677ca78cf6647653935c61a830c6ac2be97aa9a5
Apr 16 08:40:19.864083 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:19.864045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82" event={"ID":"054afdce-9736-4d1a-a2f9-6dd5993fca01","Type":"ContainerStarted","Data":"5ecfd5f48185e9fbc7505924677ca78cf6647653935c61a830c6ac2be97aa9a5"}
Apr 16 08:40:21.871744 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:21.871653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82" event={"ID":"054afdce-9736-4d1a-a2f9-6dd5993fca01","Type":"ContainerStarted","Data":"34d0dc8604eeaecc3df855b6439ddb65061ec295e2ad83e28297672735d12784"}
Apr 16 08:40:21.891239 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:40:21.891194 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-x5m82" podStartSLOduration=1.4542538 podStartE2EDuration="3.891179348s" podCreationTimestamp="2026-04-16 08:40:18 +0000 UTC" firstStartedPulling="2026-04-16 08:40:19.170830773 +0000 UTC m=+428.219407880" lastFinishedPulling="2026-04-16 08:40:21.607756315 +0000 UTC m=+430.656333428" observedRunningTime="2026-04-16 08:40:21.890140267 +0000 UTC m=+430.938717397" watchObservedRunningTime="2026-04-16 08:40:21.891179348 +0000 UTC m=+430.939756476"
Apr 16 08:42:33.858888 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.858813 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7785f9f986-698vb"]
Apr 16 08:42:33.861928 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.861909 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.876627 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.876600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7785f9f986-698vb"]
Apr 16 08:42:33.921012 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.920989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-service-ca\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.921012 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.921016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghr8r\" (UniqueName: \"kubernetes.io/projected/d699de42-74b0-483e-ae63-018a09005d0c-kube-api-access-ghr8r\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.921174 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.921085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-console-config\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.921174 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.921104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-trusted-ca-bundle\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.921174 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.921123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d699de42-74b0-483e-ae63-018a09005d0c-console-serving-cert\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.921274 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.921183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-oauth-serving-cert\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:33.921274 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:33.921242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d699de42-74b0-483e-ae63-018a09005d0c-console-oauth-config\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.021738 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.021707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-oauth-serving-cert\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.021890 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.021750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d699de42-74b0-483e-ae63-018a09005d0c-console-oauth-config\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.021890 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.021777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-service-ca\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022049 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghr8r\" (UniqueName: \"kubernetes.io/projected/d699de42-74b0-483e-ae63-018a09005d0c-kube-api-access-ghr8r\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022118 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-console-config\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022175 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-trusted-ca-bundle\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022175 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d699de42-74b0-483e-ae63-018a09005d0c-console-serving-cert\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022644 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-service-ca\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022644 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-oauth-serving-cert\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.022992 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-trusted-ca-bundle\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.023059 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.022975 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d699de42-74b0-483e-ae63-018a09005d0c-console-config\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.024259 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.024242 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d699de42-74b0-483e-ae63-018a09005d0c-console-oauth-config\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.024578 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.024562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d699de42-74b0-483e-ae63-018a09005d0c-console-serving-cert\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.031447 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.031427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghr8r\" (UniqueName: \"kubernetes.io/projected/d699de42-74b0-483e-ae63-018a09005d0c-kube-api-access-ghr8r\") pod \"console-7785f9f986-698vb\" (UID: \"d699de42-74b0-483e-ae63-018a09005d0c\") " pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.171284 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.171204 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:34.291224 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:34.291200 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7785f9f986-698vb"]
Apr 16 08:42:34.292980 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:42:34.292954 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd699de42_74b0_483e_ae63_018a09005d0c.slice/crio-a4c9c25e15966c587c99a9a887f20b21862acd1ad0ab2ab44e2fe0ee60f21980 WatchSource:0}: Error finding container a4c9c25e15966c587c99a9a887f20b21862acd1ad0ab2ab44e2fe0ee60f21980: Status 404 returned error can't find the container with id a4c9c25e15966c587c99a9a887f20b21862acd1ad0ab2ab44e2fe0ee60f21980
Apr 16 08:42:35.293637 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:35.293602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7785f9f986-698vb" event={"ID":"d699de42-74b0-483e-ae63-018a09005d0c","Type":"ContainerStarted","Data":"fa853a0daf0051ca897166ed6abed6d5b1e96633c974f14f5cd593ca99528ea6"}
Apr 16 08:42:35.293637 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:35.293641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7785f9f986-698vb" event={"ID":"d699de42-74b0-483e-ae63-018a09005d0c","Type":"ContainerStarted","Data":"a4c9c25e15966c587c99a9a887f20b21862acd1ad0ab2ab44e2fe0ee60f21980"}
Apr 16 08:42:35.313554 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:35.313509 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7785f9f986-698vb" podStartSLOduration=2.313496655 podStartE2EDuration="2.313496655s" podCreationTimestamp="2026-04-16 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:42:35.311567095 +0000 UTC m=+564.360144223" watchObservedRunningTime="2026-04-16 08:42:35.313496655 +0000 UTC m=+564.362073783"
Apr 16 08:42:39.768728 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.768695 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"]
Apr 16 08:42:39.771938 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.771921 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"
Apr 16 08:42:39.774683 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.774659 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-rpqzd\"/\"kube-root-ca.crt\""
Apr 16 08:42:39.775009 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.774991 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-rpqzd\"/\"default-dockercfg-km7mc\""
Apr 16 08:42:39.775432 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.775414 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-rpqzd\"/\"openshift-service-ca.crt\""
Apr 16 08:42:39.782826 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.782805 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"]
Apr 16 08:42:39.868493 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.868464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksdt\" (UniqueName: \"kubernetes.io/projected/091c4921-136d-488c-831e-60f917984d87-kube-api-access-7ksdt\") pod \"test-trainjob-5hfnn-node-0-0-pw8ph\" (UID: \"091c4921-136d-488c-831e-60f917984d87\") " pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"
Apr 16 08:42:39.969777 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.969737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksdt\" (UniqueName: \"kubernetes.io/projected/091c4921-136d-488c-831e-60f917984d87-kube-api-access-7ksdt\") pod \"test-trainjob-5hfnn-node-0-0-pw8ph\" (UID: \"091c4921-136d-488c-831e-60f917984d87\") " pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"
Apr 16 08:42:39.979172 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:39.979142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksdt\" (UniqueName: \"kubernetes.io/projected/091c4921-136d-488c-831e-60f917984d87-kube-api-access-7ksdt\") pod \"test-trainjob-5hfnn-node-0-0-pw8ph\" (UID: \"091c4921-136d-488c-831e-60f917984d87\") " pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"
Apr 16 08:42:40.081276 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:40.081210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"
Apr 16 08:42:40.203785 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:40.203763 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph"]
Apr 16 08:42:40.206249 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:42:40.206213 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod091c4921_136d_488c_831e_60f917984d87.slice/crio-f209de8918e4ee7b2755903413d6a4d285e7b84fbbf39ee413a05b2db3a88b9a WatchSource:0}: Error finding container f209de8918e4ee7b2755903413d6a4d285e7b84fbbf39ee413a05b2db3a88b9a: Status 404 returned error can't find the container with id f209de8918e4ee7b2755903413d6a4d285e7b84fbbf39ee413a05b2db3a88b9a
Apr 16 08:42:40.312735 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:40.312709 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" event={"ID":"091c4921-136d-488c-831e-60f917984d87","Type":"ContainerStarted","Data":"f209de8918e4ee7b2755903413d6a4d285e7b84fbbf39ee413a05b2db3a88b9a"}
Apr 16 08:42:44.171331 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:44.171296 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:44.171872 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:44.171616 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:44.178582 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:44.178557 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:44.348397 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:44.348366 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7785f9f986-698vb"
Apr 16 08:42:44.401103 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:42:44.400471 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cffff7fdf-nhfpr"]
Apr 16 08:43:09.436693 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:09.436629 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cffff7fdf-nhfpr" podUID="c49613c2-4be5-4228-968e-23cb0ef3c4a0" containerName="console" containerID="cri-o://3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e" gracePeriod=15
Apr 16 08:43:09.953982 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:09.953955 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cffff7fdf-nhfpr_c49613c2-4be5-4228-968e-23cb0ef3c4a0/console/0.log"
Apr 16 08:43:09.954122 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:09.954052 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cffff7fdf-nhfpr"
Apr 16 08:43:10.061602 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061570 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-service-ca\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.061784 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061618 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-trusted-ca-bundle\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.061784 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061751 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-oauth-serving-cert\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.061874 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061808 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-serving-cert\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.061912 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061871 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-config\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.061944 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061918 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-oauth-config\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.061994 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.061954 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cllnl\" (UniqueName: \"kubernetes.io/projected/c49613c2-4be5-4228-968e-23cb0ef3c4a0-kube-api-access-cllnl\") pod \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\" (UID: \"c49613c2-4be5-4228-968e-23cb0ef3c4a0\") "
Apr 16 08:43:10.062100 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062077 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:43:10.062223 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062186 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:43:10.062322 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062292 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-config" (OuterVolumeSpecName: "console-config") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:43:10.062402 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062371 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-service-ca" (OuterVolumeSpecName: "service-ca") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:43:10.062402 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062382 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-trusted-ca-bundle\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:43:10.062500 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062420 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-oauth-serving-cert\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:43:10.062500 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.062437 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-config\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:43:10.064598 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.064535 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:43:10.064719 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.064668 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49613c2-4be5-4228-968e-23cb0ef3c4a0-kube-api-access-cllnl" (OuterVolumeSpecName: "kube-api-access-cllnl") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "kube-api-access-cllnl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:43:10.064855 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.064730 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c49613c2-4be5-4228-968e-23cb0ef3c4a0" (UID: "c49613c2-4be5-4228-968e-23cb0ef3c4a0"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:43:10.163013 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.162978 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c49613c2-4be5-4228-968e-23cb0ef3c4a0-service-ca\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:43:10.163013 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.163014 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-serving-cert\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:43:10.163255 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.163047 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c49613c2-4be5-4228-968e-23cb0ef3c4a0-console-oauth-config\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:43:10.163255 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.163057 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cllnl\" (UniqueName: \"kubernetes.io/projected/c49613c2-4be5-4228-968e-23cb0ef3c4a0-kube-api-access-cllnl\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:43:10.466362 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.466333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cffff7fdf-nhfpr_c49613c2-4be5-4228-968e-23cb0ef3c4a0/console/0.log" Apr 16 08:43:10.466745 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.466369 2574 generic.go:358] "Generic (PLEG): container finished" podID="c49613c2-4be5-4228-968e-23cb0ef3c4a0" containerID="3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e" exitCode=2 Apr 16 08:43:10.466745 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.466400 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7cffff7fdf-nhfpr" event={"ID":"c49613c2-4be5-4228-968e-23cb0ef3c4a0","Type":"ContainerDied","Data":"3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e"} Apr 16 08:43:10.466745 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.466424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cffff7fdf-nhfpr" event={"ID":"c49613c2-4be5-4228-968e-23cb0ef3c4a0","Type":"ContainerDied","Data":"4157d8b788857e272ed0b6b34ea40e44833e734f2f550a190220f3df526a6b3d"} Apr 16 08:43:10.466745 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.466438 2574 scope.go:117] "RemoveContainer" containerID="3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e" Apr 16 08:43:10.466745 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.466452 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cffff7fdf-nhfpr" Apr 16 08:43:10.474707 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.474688 2574 scope.go:117] "RemoveContainer" containerID="3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e" Apr 16 08:43:10.474948 ip-10-0-128-115 kubenswrapper[2574]: E0416 08:43:10.474931 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e\": container with ID starting with 3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e not found: ID does not exist" containerID="3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e" Apr 16 08:43:10.475004 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.474956 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e"} err="failed to get container status \"3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e\": rpc error: code = 
NotFound desc = could not find container \"3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e\": container with ID starting with 3b9440f571cbb57a876e47967c89560e42ef3211afa69a4bd7ee4cfcc670fa1e not found: ID does not exist" Apr 16 08:43:10.491511 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.491488 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cffff7fdf-nhfpr"] Apr 16 08:43:10.499703 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:10.499686 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cffff7fdf-nhfpr"] Apr 16 08:43:11.430532 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:11.430507 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:43:11.431125 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:11.431104 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:43:11.540756 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:43:11.540711 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c49613c2-4be5-4228-968e-23cb0ef3c4a0" path="/var/lib/kubelet/pods/c49613c2-4be5-4228-968e-23cb0ef3c4a0/volumes" Apr 16 08:47:14.309919 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:14.309883 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" event={"ID":"091c4921-136d-488c-831e-60f917984d87","Type":"ContainerStarted","Data":"55f257b7a83d167a7dce9dfb8999bae773ff7ef9e86d59fea9a801cf36ee2b0a"} Apr 16 08:47:14.336439 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:14.336386 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" podStartSLOduration=1.742873144 
podStartE2EDuration="4m35.336370453s" podCreationTimestamp="2026-04-16 08:42:39 +0000 UTC" firstStartedPulling="2026-04-16 08:42:40.208005018 +0000 UTC m=+569.256582125" lastFinishedPulling="2026-04-16 08:47:13.801502327 +0000 UTC m=+842.850079434" observedRunningTime="2026-04-16 08:47:14.335709778 +0000 UTC m=+843.384286907" watchObservedRunningTime="2026-04-16 08:47:14.336370453 +0000 UTC m=+843.384947584" Apr 16 08:47:20.330840 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:20.330808 2574 generic.go:358] "Generic (PLEG): container finished" podID="091c4921-136d-488c-831e-60f917984d87" containerID="55f257b7a83d167a7dce9dfb8999bae773ff7ef9e86d59fea9a801cf36ee2b0a" exitCode=0 Apr 16 08:47:20.331273 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:20.330871 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" event={"ID":"091c4921-136d-488c-831e-60f917984d87","Type":"ContainerDied","Data":"55f257b7a83d167a7dce9dfb8999bae773ff7ef9e86d59fea9a801cf36ee2b0a"} Apr 16 08:47:21.596414 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:21.596389 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" Apr 16 08:47:21.673152 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:21.673127 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ksdt\" (UniqueName: \"kubernetes.io/projected/091c4921-136d-488c-831e-60f917984d87-kube-api-access-7ksdt\") pod \"091c4921-136d-488c-831e-60f917984d87\" (UID: \"091c4921-136d-488c-831e-60f917984d87\") " Apr 16 08:47:21.675198 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:21.675169 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091c4921-136d-488c-831e-60f917984d87-kube-api-access-7ksdt" (OuterVolumeSpecName: "kube-api-access-7ksdt") pod "091c4921-136d-488c-831e-60f917984d87" (UID: "091c4921-136d-488c-831e-60f917984d87"). InnerVolumeSpecName "kube-api-access-7ksdt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:47:21.774296 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:21.774270 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ksdt\" (UniqueName: \"kubernetes.io/projected/091c4921-136d-488c-831e-60f917984d87-kube-api-access-7ksdt\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:47:22.338943 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:22.338914 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" Apr 16 08:47:22.339195 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:22.338917 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-rpqzd/test-trainjob-5hfnn-node-0-0-pw8ph" event={"ID":"091c4921-136d-488c-831e-60f917984d87","Type":"ContainerDied","Data":"f209de8918e4ee7b2755903413d6a4d285e7b84fbbf39ee413a05b2db3a88b9a"} Apr 16 08:47:22.339195 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:22.339039 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f209de8918e4ee7b2755903413d6a4d285e7b84fbbf39ee413a05b2db3a88b9a" Apr 16 08:47:23.277138 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.277107 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh"] Apr 16 08:47:23.277490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.277415 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="091c4921-136d-488c-831e-60f917984d87" containerName="node" Apr 16 08:47:23.277490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.277427 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="091c4921-136d-488c-831e-60f917984d87" containerName="node" Apr 16 08:47:23.277490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.277444 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c49613c2-4be5-4228-968e-23cb0ef3c4a0" containerName="console" Apr 16 08:47:23.277490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.277451 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49613c2-4be5-4228-968e-23cb0ef3c4a0" containerName="console" Apr 16 08:47:23.277617 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.277508 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c49613c2-4be5-4228-968e-23cb0ef3c4a0" containerName="console" Apr 16 08:47:23.277617 ip-10-0-128-115 kubenswrapper[2574]: I0416 
08:47:23.277520 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="091c4921-136d-488c-831e-60f917984d87" containerName="node" Apr 16 08:47:23.512833 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.512800 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh"] Apr 16 08:47:23.512988 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.512900 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:47:23.515707 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.515689 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-bt7gs\"/\"openshift-service-ca.crt\"" Apr 16 08:47:23.516565 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.516546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-bt7gs\"/\"default-dockercfg-mqw42\"" Apr 16 08:47:23.516663 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.516581 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-bt7gs\"/\"kube-root-ca.crt\"" Apr 16 08:47:23.589527 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.589465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr6pj\" (UniqueName: \"kubernetes.io/projected/afef1db0-5af6-4cab-a890-4102e3309a03-kube-api-access-kr6pj\") pod \"test-trainjob-c4269-node-0-0-2ncvh\" (UID: \"afef1db0-5af6-4cab-a890-4102e3309a03\") " pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:47:23.690449 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.690420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr6pj\" (UniqueName: \"kubernetes.io/projected/afef1db0-5af6-4cab-a890-4102e3309a03-kube-api-access-kr6pj\") pod \"test-trainjob-c4269-node-0-0-2ncvh\" (UID: 
\"afef1db0-5af6-4cab-a890-4102e3309a03\") " pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:47:23.699513 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.699485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr6pj\" (UniqueName: \"kubernetes.io/projected/afef1db0-5af6-4cab-a890-4102e3309a03-kube-api-access-kr6pj\") pod \"test-trainjob-c4269-node-0-0-2ncvh\" (UID: \"afef1db0-5af6-4cab-a890-4102e3309a03\") " pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:47:23.821404 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.821369 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:47:23.939455 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.939429 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh"] Apr 16 08:47:23.941566 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:47:23.941533 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafef1db0_5af6_4cab_a890_4102e3309a03.slice/crio-45d27283a105a994ada9559b2cb32a62707b8239fc354626d29eb35f39e6278c WatchSource:0}: Error finding container 45d27283a105a994ada9559b2cb32a62707b8239fc354626d29eb35f39e6278c: Status 404 returned error can't find the container with id 45d27283a105a994ada9559b2cb32a62707b8239fc354626d29eb35f39e6278c Apr 16 08:47:23.943666 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:23.943644 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:47:24.347167 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:47:24.347134 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" 
event={"ID":"afef1db0-5af6-4cab-a890-4102e3309a03","Type":"ContainerStarted","Data":"45d27283a105a994ada9559b2cb32a62707b8239fc354626d29eb35f39e6278c"} Apr 16 08:48:11.453427 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:48:11.453350 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:48:11.455110 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:48:11.455088 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 08:51:27.223088 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:27.223051 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" event={"ID":"afef1db0-5af6-4cab-a890-4102e3309a03","Type":"ContainerStarted","Data":"88f3dfe0490ee062dde850db8387ba6e75714f899444b85d4025e9d422cb3e8a"} Apr 16 08:51:27.258510 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:27.258463 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" podStartSLOduration=2.074570744 podStartE2EDuration="4m4.258449648s" podCreationTimestamp="2026-04-16 08:47:23 +0000 UTC" firstStartedPulling="2026-04-16 08:47:23.943845769 +0000 UTC m=+852.992422881" lastFinishedPulling="2026-04-16 08:51:26.127724677 +0000 UTC m=+1095.176301785" observedRunningTime="2026-04-16 08:51:27.25651818 +0000 UTC m=+1096.305095306" watchObservedRunningTime="2026-04-16 08:51:27.258449648 +0000 UTC m=+1096.307026778" Apr 16 08:51:33.243743 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:33.243705 2574 generic.go:358] "Generic (PLEG): container finished" podID="afef1db0-5af6-4cab-a890-4102e3309a03" containerID="88f3dfe0490ee062dde850db8387ba6e75714f899444b85d4025e9d422cb3e8a" exitCode=0 Apr 16 08:51:33.244168 ip-10-0-128-115 kubenswrapper[2574]: I0416 
08:51:33.243757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" event={"ID":"afef1db0-5af6-4cab-a890-4102e3309a03","Type":"ContainerDied","Data":"88f3dfe0490ee062dde850db8387ba6e75714f899444b85d4025e9d422cb3e8a"} Apr 16 08:51:34.412009 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:34.411988 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:51:34.497453 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:34.497392 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr6pj\" (UniqueName: \"kubernetes.io/projected/afef1db0-5af6-4cab-a890-4102e3309a03-kube-api-access-kr6pj\") pod \"afef1db0-5af6-4cab-a890-4102e3309a03\" (UID: \"afef1db0-5af6-4cab-a890-4102e3309a03\") " Apr 16 08:51:34.499526 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:34.499496 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afef1db0-5af6-4cab-a890-4102e3309a03-kube-api-access-kr6pj" (OuterVolumeSpecName: "kube-api-access-kr6pj") pod "afef1db0-5af6-4cab-a890-4102e3309a03" (UID: "afef1db0-5af6-4cab-a890-4102e3309a03"). InnerVolumeSpecName "kube-api-access-kr6pj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:51:34.598178 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:34.598144 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kr6pj\" (UniqueName: \"kubernetes.io/projected/afef1db0-5af6-4cab-a890-4102e3309a03-kube-api-access-kr6pj\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\"" Apr 16 08:51:35.250675 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.250646 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" Apr 16 08:51:35.250675 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.250667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-bt7gs/test-trainjob-c4269-node-0-0-2ncvh" event={"ID":"afef1db0-5af6-4cab-a890-4102e3309a03","Type":"ContainerDied","Data":"45d27283a105a994ada9559b2cb32a62707b8239fc354626d29eb35f39e6278c"} Apr 16 08:51:35.250875 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.250696 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45d27283a105a994ada9559b2cb32a62707b8239fc354626d29eb35f39e6278c" Apr 16 08:51:35.741470 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.741436 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx"] Apr 16 08:51:35.741873 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.741742 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afef1db0-5af6-4cab-a890-4102e3309a03" containerName="node" Apr 16 08:51:35.741873 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.741754 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="afef1db0-5af6-4cab-a890-4102e3309a03" containerName="node" Apr 16 08:51:35.741873 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.741808 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="afef1db0-5af6-4cab-a890-4102e3309a03" containerName="node" Apr 16 08:51:35.773768 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.773727 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx"] Apr 16 08:51:35.773936 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.773857 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" Apr 16 08:51:35.776486 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.776456 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-qt5wp\"/\"openshift-service-ca.crt\"" Apr 16 08:51:35.776611 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.776456 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-qt5wp\"/\"kube-root-ca.crt\"" Apr 16 08:51:35.777313 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.777299 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-qt5wp\"/\"default-dockercfg-p9vx9\"" Apr 16 08:51:35.808889 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.808867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8vd4\" (UniqueName: \"kubernetes.io/projected/a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0-kube-api-access-v8vd4\") pod \"test-trainjob-frn5v-node-0-0-r8ccx\" (UID: \"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0\") " pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" Apr 16 08:51:35.910089 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.910050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8vd4\" (UniqueName: \"kubernetes.io/projected/a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0-kube-api-access-v8vd4\") pod \"test-trainjob-frn5v-node-0-0-r8ccx\" (UID: \"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0\") " pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" Apr 16 08:51:35.920507 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:35.920474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8vd4\" (UniqueName: \"kubernetes.io/projected/a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0-kube-api-access-v8vd4\") pod \"test-trainjob-frn5v-node-0-0-r8ccx\" (UID: \"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0\") " 
pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" Apr 16 08:51:36.083220 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:36.083136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" Apr 16 08:51:36.203091 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:36.203064 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx"] Apr 16 08:51:36.204739 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:51:36.204715 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b34e1f_ba9f_4dc0_a332_65fb04f12fc0.slice/crio-67ecc164d91c9d5ea6fd01217f4606b3b217dada9c0a722edaba56cf8d98dec6 WatchSource:0}: Error finding container 67ecc164d91c9d5ea6fd01217f4606b3b217dada9c0a722edaba56cf8d98dec6: Status 404 returned error can't find the container with id 67ecc164d91c9d5ea6fd01217f4606b3b217dada9c0a722edaba56cf8d98dec6 Apr 16 08:51:36.255248 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:51:36.255217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" event={"ID":"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0","Type":"ContainerStarted","Data":"67ecc164d91c9d5ea6fd01217f4606b3b217dada9c0a722edaba56cf8d98dec6"} Apr 16 08:52:55.554659 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:55.554620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" event={"ID":"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0","Type":"ContainerStarted","Data":"c4abeaa0f04dcb7fecb07c2cc83f165ff7082fdd4485a4b717cdb5f5c9734c2a"} Apr 16 08:52:55.577745 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:55.577672 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" podStartSLOduration=1.794812482 podStartE2EDuration="1m20.577654065s" 
podCreationTimestamp="2026-04-16 08:51:35 +0000 UTC" firstStartedPulling="2026-04-16 08:51:36.206760756 +0000 UTC m=+1105.255337863" lastFinishedPulling="2026-04-16 08:52:54.989602338 +0000 UTC m=+1184.038179446" observedRunningTime="2026-04-16 08:52:55.575150128 +0000 UTC m=+1184.623727258" watchObservedRunningTime="2026-04-16 08:52:55.577654065 +0000 UTC m=+1184.626231196" Apr 16 08:52:58.565797 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:58.565761 2574 generic.go:358] "Generic (PLEG): container finished" podID="a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0" containerID="c4abeaa0f04dcb7fecb07c2cc83f165ff7082fdd4485a4b717cdb5f5c9734c2a" exitCode=0 Apr 16 08:52:58.566238 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:58.565824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" event={"ID":"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0","Type":"ContainerDied","Data":"c4abeaa0f04dcb7fecb07c2cc83f165ff7082fdd4485a4b717cdb5f5c9734c2a"} Apr 16 08:52:59.768834 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:59.768811 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" Apr 16 08:52:59.869291 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:59.869225 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8vd4\" (UniqueName: \"kubernetes.io/projected/a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0-kube-api-access-v8vd4\") pod \"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0\" (UID: \"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0\") " Apr 16 08:52:59.871272 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:59.871248 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0-kube-api-access-v8vd4" (OuterVolumeSpecName: "kube-api-access-v8vd4") pod "a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0" (UID: "a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0"). 
InnerVolumeSpecName "kube-api-access-v8vd4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:52:59.969751 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:52:59.969729 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8vd4\" (UniqueName: \"kubernetes.io/projected/a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0-kube-api-access-v8vd4\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:53:00.573314 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:00.573286 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx"
Apr 16 08:53:00.573496 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:00.573318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-qt5wp/test-trainjob-frn5v-node-0-0-r8ccx" event={"ID":"a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0","Type":"ContainerDied","Data":"67ecc164d91c9d5ea6fd01217f4606b3b217dada9c0a722edaba56cf8d98dec6"}
Apr 16 08:53:00.573496 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:00.573343 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ecc164d91c9d5ea6fd01217f4606b3b217dada9c0a722edaba56cf8d98dec6"
Apr 16 08:53:01.140424 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.140394 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"]
Apr 16 08:53:01.140855 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.140694 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0" containerName="node"
Apr 16 08:53:01.140855 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.140705 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0" containerName="node"
Apr 16 08:53:01.140855 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.140765 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8b34e1f-ba9f-4dc0-a332-65fb04f12fc0" containerName="node"
Apr 16 08:53:01.210870 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.210840 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"]
Apr 16 08:53:01.211014 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.210939 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:53:01.213468 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.213443 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-x6xd5\"/\"openshift-service-ca.crt\""
Apr 16 08:53:01.213574 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.213452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-x6xd5\"/\"default-dockercfg-npfdc\""
Apr 16 08:53:01.214266 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.214250 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-x6xd5\"/\"kube-root-ca.crt\""
Apr 16 08:53:01.279633 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.279610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j42kf\" (UniqueName: \"kubernetes.io/projected/f5fe7cbd-a433-4a0b-bb9c-c41d549a616f-kube-api-access-j42kf\") pod \"test-trainjob-294gj-node-0-0-mmbsk\" (UID: \"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f\") " pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:53:01.380481 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.380455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j42kf\" (UniqueName: \"kubernetes.io/projected/f5fe7cbd-a433-4a0b-bb9c-c41d549a616f-kube-api-access-j42kf\") pod \"test-trainjob-294gj-node-0-0-mmbsk\" (UID: \"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f\") " pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:53:01.390050 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.390011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j42kf\" (UniqueName: \"kubernetes.io/projected/f5fe7cbd-a433-4a0b-bb9c-c41d549a616f-kube-api-access-j42kf\") pod \"test-trainjob-294gj-node-0-0-mmbsk\" (UID: \"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f\") " pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:53:01.520008 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.519981 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:53:01.685618 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.685588 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"]
Apr 16 08:53:01.687552 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:53:01.687528 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5fe7cbd_a433_4a0b_bb9c_c41d549a616f.slice/crio-5d97dc7490e89e13dc210b5ce7cc6a9aac08f3d8989d8e65a28bd409b5e27e0a WatchSource:0}: Error finding container 5d97dc7490e89e13dc210b5ce7cc6a9aac08f3d8989d8e65a28bd409b5e27e0a: Status 404 returned error can't find the container with id 5d97dc7490e89e13dc210b5ce7cc6a9aac08f3d8989d8e65a28bd409b5e27e0a
Apr 16 08:53:01.689675 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:01.689653 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 08:53:02.585427 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:02.585371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk" event={"ID":"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f","Type":"ContainerStarted","Data":"5d97dc7490e89e13dc210b5ce7cc6a9aac08f3d8989d8e65a28bd409b5e27e0a"}
Apr 16 08:53:14.119670 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:14.119637 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:53:14.119670 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:53:14.119649 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:59:31.167470 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:31.167441 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:59:31.167996 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:31.167658 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 08:59:32.952883 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:32.952847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk" event={"ID":"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f","Type":"ContainerStarted","Data":"cc354a2a32b11048b1e4ce4771e00add5b7793930a8104ac0448e8d20024aa1e"}
Apr 16 08:59:32.955981 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:32.955963 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-x6xd5\"/\"default-dockercfg-npfdc\""
Apr 16 08:59:33.004073 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:33.004013 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk" podStartSLOduration=1.431364235 podStartE2EDuration="6m32.003998211s" podCreationTimestamp="2026-04-16 08:53:01 +0000 UTC" firstStartedPulling="2026-04-16 08:53:01.689802777 +0000 UTC m=+1190.738379884" lastFinishedPulling="2026-04-16 08:59:32.262436738 +0000 UTC m=+1581.311013860" observedRunningTime="2026-04-16 08:59:33.000926857 +0000 UTC m=+1582.049504007" watchObservedRunningTime="2026-04-16 08:59:33.003998211 +0000 UTC m=+1582.052575337"
Apr 16 08:59:33.091251 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:33.091224 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-x6xd5\"/\"kube-root-ca.crt\""
Apr 16 08:59:33.102115 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:33.102093 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-x6xd5\"/\"openshift-service-ca.crt\""
Apr 16 08:59:35.964364 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:35.964333 2574 generic.go:358] "Generic (PLEG): container finished" podID="f5fe7cbd-a433-4a0b-bb9c-c41d549a616f" containerID="cc354a2a32b11048b1e4ce4771e00add5b7793930a8104ac0448e8d20024aa1e" exitCode=0
Apr 16 08:59:35.964668 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:35.964370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk" event={"ID":"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f","Type":"ContainerDied","Data":"cc354a2a32b11048b1e4ce4771e00add5b7793930a8104ac0448e8d20024aa1e"}
Apr 16 08:59:37.089210 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.089188 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:59:37.225263 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.225191 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j42kf\" (UniqueName: \"kubernetes.io/projected/f5fe7cbd-a433-4a0b-bb9c-c41d549a616f-kube-api-access-j42kf\") pod \"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f\" (UID: \"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f\") "
Apr 16 08:59:37.227324 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.227291 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5fe7cbd-a433-4a0b-bb9c-c41d549a616f-kube-api-access-j42kf" (OuterVolumeSpecName: "kube-api-access-j42kf") pod "f5fe7cbd-a433-4a0b-bb9c-c41d549a616f" (UID: "f5fe7cbd-a433-4a0b-bb9c-c41d549a616f"). InnerVolumeSpecName "kube-api-access-j42kf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:59:37.326446 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.326419 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j42kf\" (UniqueName: \"kubernetes.io/projected/f5fe7cbd-a433-4a0b-bb9c-c41d549a616f-kube-api-access-j42kf\") on node \"ip-10-0-128-115.ec2.internal\" DevicePath \"\""
Apr 16 08:59:37.972082 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.972043 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk"
Apr 16 08:59:37.972247 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.972059 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-x6xd5/test-trainjob-294gj-node-0-0-mmbsk" event={"ID":"f5fe7cbd-a433-4a0b-bb9c-c41d549a616f","Type":"ContainerDied","Data":"5d97dc7490e89e13dc210b5ce7cc6a9aac08f3d8989d8e65a28bd409b5e27e0a"}
Apr 16 08:59:37.972247 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:37.972154 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d97dc7490e89e13dc210b5ce7cc6a9aac08f3d8989d8e65a28bd409b5e27e0a"
Apr 16 08:59:38.530225 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.530194 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"]
Apr 16 08:59:38.530576 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.530506 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5fe7cbd-a433-4a0b-bb9c-c41d549a616f" containerName="node"
Apr 16 08:59:38.530576 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.530516 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5fe7cbd-a433-4a0b-bb9c-c41d549a616f" containerName="node"
Apr 16 08:59:38.530576 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.530573 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5fe7cbd-a433-4a0b-bb9c-c41d549a616f" containerName="node"
Apr 16 08:59:38.550110 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.550082 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"]
Apr 16 08:59:38.550263 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.550190 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"
Apr 16 08:59:38.552510 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.552480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-h4wbt\"/\"kube-root-ca.crt\""
Apr 16 08:59:38.552641 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.552494 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-h4wbt\"/\"openshift-service-ca.crt\""
Apr 16 08:59:38.552641 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.552494 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-h4wbt\"/\"default-dockercfg-9wswn\""
Apr 16 08:59:38.638367 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.638332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8wt\" (UniqueName: \"kubernetes.io/projected/933de4fa-d5fb-4306-9f51-0c7f50ddb9be-kube-api-access-4b8wt\") pod \"test-trainjob-n877d-node-0-0-66sbq\" (UID: \"933de4fa-d5fb-4306-9f51-0c7f50ddb9be\") " pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"
Apr 16 08:59:38.739256 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.739218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8wt\" (UniqueName: \"kubernetes.io/projected/933de4fa-d5fb-4306-9f51-0c7f50ddb9be-kube-api-access-4b8wt\") pod \"test-trainjob-n877d-node-0-0-66sbq\" (UID: \"933de4fa-d5fb-4306-9f51-0c7f50ddb9be\") " pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"
Apr 16 08:59:38.749574 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.749552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8wt\" (UniqueName: \"kubernetes.io/projected/933de4fa-d5fb-4306-9f51-0c7f50ddb9be-kube-api-access-4b8wt\") pod \"test-trainjob-n877d-node-0-0-66sbq\" (UID: \"933de4fa-d5fb-4306-9f51-0c7f50ddb9be\") " pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"
Apr 16 08:59:38.861808 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.861740 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"
Apr 16 08:59:38.983809 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.983783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq"]
Apr 16 08:59:38.985640 ip-10-0-128-115 kubenswrapper[2574]: W0416 08:59:38.985612 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod933de4fa_d5fb_4306_9f51_0c7f50ddb9be.slice/crio-af079476ba741b20b5a6dbe5ae2ae88fa7bfddc726f9cee91878a8b63771284c WatchSource:0}: Error finding container af079476ba741b20b5a6dbe5ae2ae88fa7bfddc726f9cee91878a8b63771284c: Status 404 returned error can't find the container with id af079476ba741b20b5a6dbe5ae2ae88fa7bfddc726f9cee91878a8b63771284c
Apr 16 08:59:38.987490 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:38.987470 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 08:59:39.980616 ip-10-0-128-115 kubenswrapper[2574]: I0416 08:59:39.980571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq" event={"ID":"933de4fa-d5fb-4306-9f51-0c7f50ddb9be","Type":"ContainerStarted","Data":"af079476ba741b20b5a6dbe5ae2ae88fa7bfddc726f9cee91878a8b63771284c"}
Apr 16 09:04:31.191185 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:04:31.191108 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 09:04:31.192050 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:04:31.192010 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 09:05:37.493843 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:05:37.493804 2574 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage"
Apr 16 09:05:37.494444 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:05:37.493865 2574 container_gc.go:86] "Attempting to delete unused containers"
Apr 16 09:05:37.495421 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:05:37.495398 2574 scope.go:117] "RemoveContainer" containerID="55f257b7a83d167a7dce9dfb8999bae773ff7ef9e86d59fea9a801cf36ee2b0a"
Apr 16 09:05:39.643172 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:05:39.643138 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasDiskPressure"
Apr 16 09:07:31.195600 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:07:31.195510 2574 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil"
Apr 16 09:07:31.195600 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:07:31.195564 2574 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL"
Apr 16 09:07:31.195600 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:07:31.195583 2574 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL"
Apr 16 09:07:37.496269 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:07:37.496226 2574 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="55f257b7a83d167a7dce9dfb8999bae773ff7ef9e86d59fea9a801cf36ee2b0a"
Apr 16 09:07:37.496269 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:07:37.496277 2574 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="55f257b7a83d167a7dce9dfb8999bae773ff7ef9e86d59fea9a801cf36ee2b0a"
Apr 16 09:07:37.496806 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:07:37.496297 2574 scope.go:117] "RemoveContainer" containerID="cc354a2a32b11048b1e4ce4771e00add5b7793930a8104ac0448e8d20024aa1e"
Apr 16 09:09:37.497478 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:37.497365 2574 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="cc354a2a32b11048b1e4ce4771e00add5b7793930a8104ac0448e8d20024aa1e"
Apr 16 09:09:37.497478 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:37.497437 2574 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" containerID="cc354a2a32b11048b1e4ce4771e00add5b7793930a8104ac0448e8d20024aa1e"
Apr 16 09:09:37.497478 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:37.497464 2574 scope.go:117] "RemoveContainer" containerID="c4abeaa0f04dcb7fecb07c2cc83f165ff7082fdd4485a4b717cdb5f5c9734c2a"
Apr 16 09:09:45.040734 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.040706 2574 scope.go:117] "RemoveContainer" containerID="88f3dfe0490ee062dde850db8387ba6e75714f899444b85d4025e9d422cb3e8a"
Apr 16 09:09:45.097877 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.097856 2574 image_gc_manager.go:447] "Attempting to delete unused images"
Apr 16 09:09:45.104122 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.104102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 09:09:45.104616 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.104595 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 09:09:45.111736 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.111712 2574 image_gc_manager.go:391] "Disk usage on image filesystem is over the high threshold, trying to free bytes down to the low threshold" usage=100 highThreshold=85 amountToFree=25648785817 lowThreshold=80
Apr 16 09:09:45.111736 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.111736 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51" size=1065432607 runtimeHandler=""
Apr 16 09:09:45.112408 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.112374 2574 log.go:32] "RemoveImage from image service failed" err="rpc error: code = Unknown desc = delete image: image used by 1c48658e592f2592ce7b8c4fd41c4909e9781d6e6f9eb291cee4a7dd2e1462a6: image is in use by a container" image="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51"
Apr 16 09:09:45.112493 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.112418 2574 kuberuntime_image.go:137] "Failed to remove image" err="rpc error: code = Unknown desc = delete image: image used by 1c48658e592f2592ce7b8c4fd41c4909e9781d6e6f9eb291cee4a7dd2e1462a6: image is in use by a container" image="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51"
Apr 16 09:09:45.112493 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.112431 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="ddcfeee20e1ff551b445c3628005b77e942ea4ac3d8392e643c50a8a475c3949" size=1064972629 runtimeHandler=""
Apr 16 09:09:45.115849 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.115833 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log"
Apr 16 09:09:45.120061 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.120042 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51" size=1065432607 runtimeHandler=""
Apr 16 09:09:45.622748 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.622710 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler=""
Apr 16 09:09:45.623005 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.622975 2574 log.go:32] "RemoveImage from image service failed" err="rpc error: code = Unknown desc = delete image: image used by 1c48658e592f2592ce7b8c4fd41c4909e9781d6e6f9eb291cee4a7dd2e1462a6: image is in use by a container" image="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51"
Apr 16 09:09:45.623128 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.623013 2574 kuberuntime_image.go:137] "Failed to remove image" err="rpc error: code = Unknown desc = delete image: image used by 1c48658e592f2592ce7b8c4fd41c4909e9781d6e6f9eb291cee4a7dd2e1462a6: image is in use by a container" image="3038b4f9b9b980b9b22e6ca050370d17ba44cc8e44a875b6e8bffe0bb887ba51"
Apr 16 09:09:45.623128 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:45.623044 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="ddcfeee20e1ff551b445c3628005b77e942ea4ac3d8392e643c50a8a475c3949" size=1064972629 runtimeHandler=""
Apr 16 09:09:45.727969 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.727927 2574 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/librocsolver.so.0.4.60403: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6"
Apr 16 09:09:45.728200 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.728143 2574 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-n877d-node-0-0.test-trainjob-n877d,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4b8wt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-n877d-node-0-0-66sbq_test-ns-h4wbt(933de4fa-d5fb-4306-9f51-0c7f50ddb9be): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/librocsolver.so.0.4.60403: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 16 09:09:45.729345 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:45.729316 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/librocsolver.so.0.4.60403: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq" podUID="933de4fa-d5fb-4306-9f51-0c7f50ddb9be"
Apr 16 09:09:46.132512 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:46.132485 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-h4wbt\"/\"default-dockercfg-9wswn\""
Apr 16 09:09:46.214800 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:46.214773 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-h4wbt\"/\"kube-root-ca.crt\""
Apr 16 09:09:46.225260 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:46.225243 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-h4wbt\"/\"openshift-service-ca.crt\""
Apr 16 09:09:49.462277 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:49.461744 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler=""
Apr 16 09:09:49.462277 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:49.461744 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="1f369340181d991e38175950af0cc351e6c2eaddca22075fb9259c214ef5b9c9" size=7588072888 runtimeHandler=""
Apr 16 09:09:49.462277 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:49.462071 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 09:09:49.462277 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:49.462254 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="65e84aed5e78009bfc4af6cd682cca48f00a6e4317ab8d2b37f037ee1b735dc8" size=23199586225 runtimeHandler=""
Apr 16 09:09:49.462704 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:49.462290 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/librocsolver.so.0.4.60403: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-h4wbt/test-trainjob-n877d-node-0-0-66sbq" podUID="933de4fa-d5fb-4306-9f51-0c7f50ddb9be"
Apr 16 09:09:52.458152 ip-10-0-128-115 kubenswrapper[2574]: E0416 09:09:52.458108 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="wanted to free 25648785817 bytes, but freed 29459917620 bytes space with errors in image deletion: rpc error: code = Unknown desc = delete image: image used by 1c48658e592f2592ce7b8c4fd41c4909e9781d6e6f9eb291cee4a7dd2e1462a6: image is in use by a container"
Apr 16 09:09:56.582918 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:56.582883 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="1f369340181d991e38175950af0cc351e6c2eaddca22075fb9259c214ef5b9c9" size=7588072888 runtimeHandler=""
Apr 16 09:09:56.583355 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:56.583327 2574 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler=""
Apr 16 09:09:59.454285 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:59.454249 2574 eviction_manager.go:473] "Eviction manager: unexpected error when attempting to reduce resource pressure" resourceName="ephemeral-storage" err="wanted to free 9223372036854775807 bytes, but freed 71532962066 bytes space with errors in image deletion: rpc error: code = Unknown desc = delete image: image used by 1c48658e592f2592ce7b8c4fd41c4909e9781d6e6f9eb291cee4a7dd2e1462a6: image is in use by a container"
Apr 16 09:09:59.462139 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:09:59.462113 2574 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." resourceName="ephemeral-storage"
Apr 16 09:10:46.552638 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:10:46.552562 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-115.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 09:13:31.808656 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.808572 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sqkd/must-gather-8dbk4"]
Apr 16 09:13:31.812584 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.812566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:31.814795 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.814770 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sqkd\"/\"kube-root-ca.crt\""
Apr 16 09:13:31.815655 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.815629 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sqkd\"/\"default-dockercfg-kg5b9\""
Apr 16 09:13:31.815754 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.815657 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sqkd\"/\"openshift-service-ca.crt\""
Apr 16 09:13:31.820930 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.820908 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/must-gather-8dbk4"]
Apr 16 09:13:31.947069 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.947013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmxm\" (UniqueName: \"kubernetes.io/projected/91b1ab90-6574-4f00-b4b5-0032e9bd549f-kube-api-access-vtmxm\") pod \"must-gather-8dbk4\" (UID: \"91b1ab90-6574-4f00-b4b5-0032e9bd549f\") " pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:31.947217 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:31.947136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b1ab90-6574-4f00-b4b5-0032e9bd549f-must-gather-output\") pod \"must-gather-8dbk4\" (UID: \"91b1ab90-6574-4f00-b4b5-0032e9bd549f\") " pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:32.048189 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.048152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmxm\" (UniqueName: \"kubernetes.io/projected/91b1ab90-6574-4f00-b4b5-0032e9bd549f-kube-api-access-vtmxm\") pod \"must-gather-8dbk4\" (UID: \"91b1ab90-6574-4f00-b4b5-0032e9bd549f\") " pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:32.048348 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.048225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b1ab90-6574-4f00-b4b5-0032e9bd549f-must-gather-output\") pod \"must-gather-8dbk4\" (UID: \"91b1ab90-6574-4f00-b4b5-0032e9bd549f\") " pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:32.048586 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.048567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/91b1ab90-6574-4f00-b4b5-0032e9bd549f-must-gather-output\") pod \"must-gather-8dbk4\" (UID: \"91b1ab90-6574-4f00-b4b5-0032e9bd549f\") " pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:32.057451 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.057403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmxm\" (UniqueName: \"kubernetes.io/projected/91b1ab90-6574-4f00-b4b5-0032e9bd549f-kube-api-access-vtmxm\") pod \"must-gather-8dbk4\" (UID: \"91b1ab90-6574-4f00-b4b5-0032e9bd549f\") " pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:32.123056 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.122964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/must-gather-8dbk4"
Apr 16 09:13:32.253113 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.253088 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/must-gather-8dbk4"]
Apr 16 09:13:32.256153 ip-10-0-128-115 kubenswrapper[2574]: W0416 09:13:32.256124 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b1ab90_6574_4f00_b4b5_0032e9bd549f.slice/crio-07dd283bbe06449cf9d0678e2a6abf34893e5ca0442abaff6f7dce53156cf31c WatchSource:0}: Error finding container 07dd283bbe06449cf9d0678e2a6abf34893e5ca0442abaff6f7dce53156cf31c: Status 404 returned error can't find the container with id 07dd283bbe06449cf9d0678e2a6abf34893e5ca0442abaff6f7dce53156cf31c
Apr 16 09:13:32.915224 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:32.915183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/must-gather-8dbk4" event={"ID":"91b1ab90-6574-4f00-b4b5-0032e9bd549f","Type":"ContainerStarted","Data":"07dd283bbe06449cf9d0678e2a6abf34893e5ca0442abaff6f7dce53156cf31c"}
Apr 16 09:13:33.924715 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:33.923925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/must-gather-8dbk4" event={"ID":"91b1ab90-6574-4f00-b4b5-0032e9bd549f","Type":"ContainerStarted","Data":"0b72e67f3e0cf197540286a0230baa82e0b1fbc07ff1cb89a258d5449b972312"}
Apr 16 09:13:33.924715 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:33.923968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/must-gather-8dbk4" event={"ID":"91b1ab90-6574-4f00-b4b5-0032e9bd549f","Type":"ContainerStarted","Data":"e6a67d66149043b56f775880830467a041ec3fe257d12cb099ebb5edd58809db"}
Apr 16 09:13:33.945287 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:33.944675 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sqkd/must-gather-8dbk4" podStartSLOduration=2.121445792 podStartE2EDuration="2.944619272s" podCreationTimestamp="2026-04-16 09:13:31 +0000 UTC" firstStartedPulling="2026-04-16 09:13:32.258397238 +0000 UTC m=+2421.306974345" lastFinishedPulling="2026-04-16 09:13:33.081570715 +0000 UTC m=+2422.130147825" observedRunningTime="2026-04-16 09:13:33.94110694 +0000 UTC m=+2422.989684071" watchObservedRunningTime="2026-04-16 09:13:33.944619272 +0000 UTC m=+2422.993196402"
Apr 16 09:13:34.850549 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:34.850509 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-95szv_7a800b20-6dc5-4861-9dc2-f65c151011c7/global-pull-secret-syncer/0.log"
Apr 16 09:13:35.034502 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:35.034469 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rllmj_d6cfd862-350d-4749-a71a-3363dc8bbfa0/konnectivity-agent/0.log"
Apr 16 09:13:35.085866 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:35.085820 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-115.ec2.internal_69c436ee0b974ec434e2858234467270/haproxy/0.log"
Apr 16 09:13:38.656547 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.656507 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/alertmanager/0.log"
Apr 16 09:13:38.682920 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.682850 2574 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/config-reloader/0.log" Apr 16 09:13:38.712552 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.712486 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/kube-rbac-proxy-web/0.log" Apr 16 09:13:38.742827 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.742748 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/kube-rbac-proxy/0.log" Apr 16 09:13:38.769432 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.769395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/kube-rbac-proxy-metric/0.log" Apr 16 09:13:38.801924 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.801894 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/prom-label-proxy/0.log" Apr 16 09:13:38.876781 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.876683 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_59674a8e-e0d2-4a74-9290-22c5a36c48b1/init-config-reloader/0.log" Apr 16 09:13:38.940423 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:38.940059 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-6q9xt_b6d16575-3414-445e-b597-457d144a72f3/cluster-monitoring-operator/0.log" Apr 16 09:13:39.054221 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.054157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-848cf94df8-z9594_506fe655-8846-49dc-9a6e-28c4bd234649/metrics-server/0.log" Apr 16 09:13:39.082193 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.082136 
2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-9zc2x_2eed6f1e-3c07-4157-a57f-7a59091a0743/monitoring-plugin/0.log" Apr 16 09:13:39.201454 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.201327 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s8nnn_8dbc5caf-71de-4f52-b43e-7b8c66ccbd78/node-exporter/0.log" Apr 16 09:13:39.229845 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.229813 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s8nnn_8dbc5caf-71de-4f52-b43e-7b8c66ccbd78/kube-rbac-proxy/0.log" Apr 16 09:13:39.256549 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.256526 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s8nnn_8dbc5caf-71de-4f52-b43e-7b8c66ccbd78/init-textfile/0.log" Apr 16 09:13:39.743538 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.743512 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-mt5k4_979e002d-fc74-4d7d-ab0d-89e5d381e244/prometheus-operator/0.log" Apr 16 09:13:39.771533 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.771508 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-mt5k4_979e002d-fc74-4d7d-ab0d-89e5d381e244/kube-rbac-proxy/0.log" Apr 16 09:13:39.843085 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.843054 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b684bb9db-s7xtw_5e19740e-e897-4983-831e-38a5090e217f/telemeter-client/0.log" Apr 16 09:13:39.868229 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.868197 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b684bb9db-s7xtw_5e19740e-e897-4983-831e-38a5090e217f/reload/0.log" Apr 16 09:13:39.895370 
ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:39.895339 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b684bb9db-s7xtw_5e19740e-e897-4983-831e-38a5090e217f/kube-rbac-proxy/0.log" Apr 16 09:13:41.865137 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:41.865104 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7785f9f986-698vb_d699de42-74b0-483e-ae63-018a09005d0c/console/0.log" Apr 16 09:13:41.903567 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:41.903539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-qhwnd_bf11a112-7c77-4855-924a-4cbe4f4b77eb/download-server/0.log" Apr 16 09:13:42.361647 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.361617 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-z8dmd_6e233b43-9590-4290-81b4-184a88df4ccf/volume-data-source-validator/0.log" Apr 16 09:13:42.369966 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.369937 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52"] Apr 16 09:13:42.374748 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.374727 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.381832 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.381796 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52"] Apr 16 09:13:42.556565 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.556530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-proc\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.556744 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.556642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-sys\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.556744 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.556672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-lib-modules\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.556744 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.556716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djhd\" (UniqueName: \"kubernetes.io/projected/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-kube-api-access-4djhd\") pod \"perf-node-gather-daemonset-fdr52\" (UID: 
\"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.556883 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.556745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-podres\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657686 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-sys\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657686 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-lib-modules\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657907 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4djhd\" (UniqueName: \"kubernetes.io/projected/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-kube-api-access-4djhd\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657907 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-lib-modules\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657907 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-sys\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657907 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-podres\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657907 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-proc\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.657907 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-podres\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.658211 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.657929 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-proc\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.667803 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.667776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djhd\" (UniqueName: \"kubernetes.io/projected/c0a9b6c7-7a30-4cf2-b733-5436252ea1de-kube-api-access-4djhd\") pod \"perf-node-gather-daemonset-fdr52\" (UID: \"c0a9b6c7-7a30-4cf2-b733-5436252ea1de\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.687606 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.687577 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:42.834993 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.834970 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52"] Apr 16 09:13:42.837182 ip-10-0-128-115 kubenswrapper[2574]: W0416 09:13:42.837147 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc0a9b6c7_7a30_4cf2_b733_5436252ea1de.slice/crio-ae184c2b3110afb52332d8d4a4835215d7716d2685f05778817887193cb81fe7 WatchSource:0}: Error finding container ae184c2b3110afb52332d8d4a4835215d7716d2685f05778817887193cb81fe7: Status 404 returned error can't find the container with id ae184c2b3110afb52332d8d4a4835215d7716d2685f05778817887193cb81fe7 Apr 16 09:13:42.985969 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:42.985932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" 
event={"ID":"c0a9b6c7-7a30-4cf2-b733-5436252ea1de","Type":"ContainerStarted","Data":"ae184c2b3110afb52332d8d4a4835215d7716d2685f05778817887193cb81fe7"} Apr 16 09:13:43.178530 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:43.178490 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zc2r7_5ee54e49-cfe0-4681-a0fd-a87ecc0d841c/dns/0.log" Apr 16 09:13:43.199963 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:43.199940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zc2r7_5ee54e49-cfe0-4681-a0fd-a87ecc0d841c/kube-rbac-proxy/0.log" Apr 16 09:13:43.269575 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:43.269497 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wqhp5_99c69840-b6dc-47aa-a435-9c9a49111d84/dns-node-resolver/0.log" Apr 16 09:13:43.804643 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:43.804619 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mvrgd_eacd4fff-e409-4534-945c-507d909b8258/node-ca/0.log" Apr 16 09:13:43.990792 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:43.990748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" event={"ID":"c0a9b6c7-7a30-4cf2-b733-5436252ea1de","Type":"ContainerStarted","Data":"ac3c1944e8e4349b150c8eef5bb6913223f652c3cb799578a265c656a08607d6"} Apr 16 09:13:43.991252 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:43.990820 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:44.007094 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:44.007040 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" podStartSLOduration=2.00700256 podStartE2EDuration="2.00700256s" podCreationTimestamp="2026-04-16 
09:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 09:13:44.005819832 +0000 UTC m=+2433.054396961" watchObservedRunningTime="2026-04-16 09:13:44.00700256 +0000 UTC m=+2433.055579691" Apr 16 09:13:44.915319 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:44.915290 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rnmq6_6ac45079-5104-4b55-acd6-dd06367716a0/serve-healthcheck-canary/0.log" Apr 16 09:13:45.288753 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:45.288722 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-f5z6f_d76203c8-22cc-48c0-a9be-e10030af2601/insights-operator/1.log" Apr 16 09:13:45.289158 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:45.288874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-f5z6f_d76203c8-22cc-48c0-a9be-e10030af2601/insights-operator/0.log" Apr 16 09:13:45.447830 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:45.447802 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h85fk_e8741a85-3d9a-4923-833d-ff0cdacf96dd/kube-rbac-proxy/0.log" Apr 16 09:13:45.468681 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:45.468657 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h85fk_e8741a85-3d9a-4923-833d-ff0cdacf96dd/exporter/0.log" Apr 16 09:13:45.491725 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:45.491706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h85fk_e8741a85-3d9a-4923-833d-ff0cdacf96dd/extractor/0.log" Apr 16 09:13:47.296779 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:47.296734 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-x5m82_054afdce-9736-4d1a-a2f9-6dd5993fca01/jobset-operator/0.log" Apr 16 09:13:50.011130 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:50.010431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-fdr52" Apr 16 09:13:51.216946 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:51.216871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-psv25_feaae171-21c4-4aed-973a-8bfcf22b6913/kube-storage-version-migrator-operator/1.log" Apr 16 09:13:51.219288 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:51.219202 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-psv25_feaae171-21c4-4aed-973a-8bfcf22b6913/kube-storage-version-migrator-operator/0.log" Apr 16 09:13:52.513608 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.513575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/kube-multus-additional-cni-plugins/0.log" Apr 16 09:13:52.536213 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.536190 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/egress-router-binary-copy/0.log" Apr 16 09:13:52.558067 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.558043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/cni-plugins/0.log" Apr 16 09:13:52.579912 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.579882 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/bond-cni-plugin/0.log" Apr 16 09:13:52.601556 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.601533 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/routeoverride-cni/0.log" Apr 16 09:13:52.623011 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.622990 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/whereabouts-cni-bincopy/0.log" Apr 16 09:13:52.643606 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.643579 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87mwn_994019bc-fe5d-4c20-abc0-f589b27a59ca/whereabouts-cni/0.log" Apr 16 09:13:52.850054 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.849939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b8fml_e1f1bf71-c497-4ba1-8e98-12b8dcdc7dcb/kube-multus/0.log" Apr 16 09:13:52.983708 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:52.983680 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r9fn4_b1290b06-222c-45ae-985a-c88370488114/network-metrics-daemon/0.log" Apr 16 09:13:53.006971 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:53.006950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r9fn4_b1290b06-222c-45ae-985a-c88370488114/kube-rbac-proxy/0.log" Apr 16 09:13:54.092264 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.092231 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-controller/0.log" Apr 16 09:13:54.109432 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.109408 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/0.log" Apr 16 09:13:54.132414 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.132386 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovn-acl-logging/1.log" Apr 16 09:13:54.156671 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.156647 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/kube-rbac-proxy-node/0.log" Apr 16 09:13:54.180855 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.180770 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 09:13:54.197840 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.197813 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/northd/0.log" Apr 16 09:13:54.219611 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.219584 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/nbdb/0.log" Apr 16 09:13:54.242925 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.242907 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/sbdb/0.log" Apr 16 09:13:54.431207 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:54.431175 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f6wjp_d6bc0f25-3003-4855-b122-6d1820717354/ovnkube-controller/0.log" Apr 16 09:13:55.802676 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:55.802648 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-nzw48_ae71d722-6e3e-4e76-b3a7-90b11657ce93/check-endpoints/0.log" Apr 16 09:13:55.879802 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:55.879757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pq6xw_174399de-7e6b-4315-ba27-7e933c5c30d9/network-check-target-container/0.log" Apr 16 09:13:56.787484 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:56.787453 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2xl8x_7f8ea69a-dea8-4b99-b06b-d678bbe4c26e/iptables-alerter/0.log" Apr 16 09:13:57.474559 ip-10-0-128-115 kubenswrapper[2574]: I0416 09:13:57.474533 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6rqnd_ec8a52eb-3122-4a62-bca8-7bf7966c67e7/tuned/0.log"