Apr 23 17:41:08.177234 ip-10-0-131-107 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:41:08.645491 ip-10-0-131-107 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:08.645491 ip-10-0-131-107 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:41:08.645491 ip-10-0-131-107 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:08.645491 ip-10-0-131-107 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:41:08.645491 ip-10-0-131-107 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
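The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file passed via --config (this log shows --config="/etc/kubernetes/kubelet.conf" further down). A minimal sketch of what that migration could look like, assuming KubeletConfiguration v1beta1 field names; all values below are illustrative placeholders, not the ones this cluster actually uses:

```yaml
# Illustrative fragment of the file passed via --config
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces the deprecated --container-runtime-endpoint flag
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces the deprecated --volume-plugin-dir flag
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces the deprecated --system-reserved flag
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

On a managed cluster such as this one, the machine-config operator owns this file, so these settings would normally be changed through its API rather than edited in place.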
Apr 23 17:41:08.646378 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.646291 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:41:08.648694 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648678 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648695 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648700 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648703 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648706 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648708 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648711 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648715 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648718 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648722 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648725 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648728 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:08.648729 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648731 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648741 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648744 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648746 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648749 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648752 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648755 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648757 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648760 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648763 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648766 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648768 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648771 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648774 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648777 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648779 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648782 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648784 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648787 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648789 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648792 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:08.649047 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648794 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648797 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648799 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648802 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648804 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648807 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648811 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648815 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648817 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648820 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648823 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648826 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648828 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648832 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648835 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648838 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648840 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648843 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648846 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:08.649591 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648848 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648851 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648853 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648856 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648858 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648860 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648863 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648865 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648868 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648870 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648873 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648875 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648878 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648880 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648883 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648886 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648888 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648890 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648893 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648895 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:08.650077 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648899 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648901 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648904 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648906 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648909 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648911 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648915 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648918 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648921 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648923 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648926 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648929 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648945 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.648948 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649338 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649342 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649345 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649348 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649351 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:08.650543 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649353 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649355 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649358 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649360 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649364 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649367 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649370 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649372 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649375 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649378 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649380 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649383 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649386 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649388 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649391 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649393 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649396 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649399 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649401 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649404 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:08.651017 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649408 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649410 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649413 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649416 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649418 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649420 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649423 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649425 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649428 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649430 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649433 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649435 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649438 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649440 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649442 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649445 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649447 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649450 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649452 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649455 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:08.651509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649457 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649460 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649462 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649465 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649467 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649470 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649474 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649477 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649480 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649482 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649494 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649497 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649500 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649503 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649506 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649508 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649511 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649513 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649517 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649521 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:08.652006 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649524 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649528 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649531 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649534 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649536 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649539 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649541 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649544 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649546 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649549 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649551 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649554 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649558 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649561 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649564 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649567 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649569 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649572 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649575 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649578 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:08.652494 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.649581 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649654 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649661 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649667 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649672 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649676 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649680 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649684 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649689 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649692 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649695 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649698 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649702 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649705 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649708 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649711 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649714 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649716 2579 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649719 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649722 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649727 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649730 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649733 2579 flags.go:64] FLAG: --config-dir=""
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649736 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649739 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:41:08.653011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649743 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649746 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649749 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649752 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649755 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649758 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649762 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649765 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649768 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649772 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649775 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649778 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649781 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649784 2579 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649787 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649791 2579 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649794 2579 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649797 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649800 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649803 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649807 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649810 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649813 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649816 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649819 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:41:08.653632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649822 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649825 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649828 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649831 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649834 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649836 2579 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649840 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649843 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 17:41:08.654248 ip-10-0-131-107
kubenswrapper[2579]: I0423 17:41:08.649846 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649849 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649852 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649855 2579 flags.go:64] FLAG: --help="false" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649858 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-131-107.ec2.internal" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649861 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649865 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649868 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649871 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649875 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649877 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649880 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649883 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649886 2579 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649889 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649892 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:41:08.654248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649895 2579 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649898 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649900 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649903 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649906 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649908 2579 flags.go:64] FLAG: --lock-file="" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649911 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649914 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649917 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649922 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649926 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649928 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: 
I0423 17:41:08.649947 2579 flags.go:64] FLAG: --logging-format="text" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649953 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649958 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649962 2579 flags.go:64] FLAG: --manifest-url="" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649968 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649973 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649976 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649980 2579 flags.go:64] FLAG: --max-pods="110" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649983 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649986 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649989 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649992 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649995 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:41:08.654850 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.649998 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650000 2579 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650008 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650011 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650014 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650016 2579 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650020 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650025 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650028 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650031 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650034 2579 flags.go:64] FLAG: --port="10250" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650037 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650040 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0144f7abb819c4410" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650043 2579 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650046 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650049 
2579 flags.go:64] FLAG: --register-node="true" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650051 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650055 2579 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650058 2579 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650061 2579 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650064 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650066 2579 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650070 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650073 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650076 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:41:08.655535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650079 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650081 2579 flags.go:64] FLAG: --runonce="false" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650084 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650087 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650091 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 
17:41:08.650094 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650096 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650100 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650103 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650106 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650109 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650112 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650114 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650117 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650121 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650124 2579 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650126 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650131 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650134 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650137 2579 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650142 2579 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650145 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650147 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650150 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650153 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:41:08.656521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650156 2579 flags.go:64] FLAG: --v="2" Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650161 2579 flags.go:64] FLAG: --version="false" Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650164 2579 flags.go:64] FLAG: --vmodule="" Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650169 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650172 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650262 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650265 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650268 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650271 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 
23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650274 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650277 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650279 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650282 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650284 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650287 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650290 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650293 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650296 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650299 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650302 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:41:08.657379 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650304 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650306 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650309 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650312 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650314 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650316 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650319 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650321 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650324 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650326 2579 feature_gate.go:328] unrecognized 
feature gate: NutanixMultiSubnets Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650329 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650333 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650335 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650338 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650340 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650343 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650345 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650348 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650350 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650353 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:41:08.657906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650355 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650361 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 
17:41:08.650363 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650366 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650368 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650371 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650373 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650376 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650378 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650380 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650383 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650386 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650388 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650391 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650393 2579 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650396 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650399 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650401 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650404 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650407 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:41:08.658511 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650409 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650411 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650415 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650420 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650423 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650426 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650429 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650432 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650435 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650437 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650440 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650442 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650444 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650448 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650451 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650453 2579 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650456 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650458 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650461 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:08.659072 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650463 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650466 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650468 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650471 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650473 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650476 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650478 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650480 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650483 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650486 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650488 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.650491 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.650503 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.658995 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.659118 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659166 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:08.659534 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659171 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659175 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659180 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659184 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659187 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659190 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659193 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659195 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659198 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659200 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659203 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659205 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659208 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659211 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659214 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659216 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659219 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659222 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659224 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659227 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:08.659931 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659229 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659233 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659235 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659238 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659240 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659243 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659245 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659247 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659251 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659254 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659256 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659259 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659261 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659264 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659267 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659269 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659272 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659276 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659279 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659282 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:08.660442 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659285 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659287 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659290 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659292 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659295 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659297 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659300 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659302 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659305 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659308 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659310 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659313 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659315 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659318 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659320 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659323 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659325 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659327 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659331 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:08.660948 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659336 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659339 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659342 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659345 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659348 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659351 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659354 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659356 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659359 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659361 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659364 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659366 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659369 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659371 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659373 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659423 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659427 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659430 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659432 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659436 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:08.661413 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659438 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659441 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659444 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659446 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659449 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.659451 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.659457 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660290 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660297 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660301 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660304 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660306 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660310 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660312 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660316 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:08.661906 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660319 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660322 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660326 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660329 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660332 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660335 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660337 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660340 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660343 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660345 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660348 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660350 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660353 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660356 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660358 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660361 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660363 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660366 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660369 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660371 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:08.662299 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660374 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660376 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660379 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660382 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660385 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660387 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660390 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660392 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660395 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660397 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660400 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660402 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660405 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660407 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660410 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660412 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660415 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660417 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660419 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660422 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:08.662791 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660424 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660427 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660429 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660432 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660434 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660437 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660439 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660450 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660453 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660456 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660459 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660463 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660466 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660469 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660472 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660475 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660477 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660480 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660483 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:08.663355 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660485 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660488 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660490 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660493 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660495 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660498 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660500 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660503 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660505 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660508 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660510 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660513 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660515 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660519 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660522 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660525 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660527 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660530 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:08.663807 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:08.660532 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:08.664280 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.660537 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:41:08.664280 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.661238 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 17:41:08.667624 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.667608 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 17:41:08.668675 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.668663 2579 server.go:1019] "Starting client certificate rotation"
Apr 23 17:41:08.668772 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.668755 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:41:08.668805 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.668792 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 17:41:08.695528 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.695507 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:41:08.700351 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.700333 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 17:41:08.721540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.721512 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 23 17:41:08.727454 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.727430 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:41:08.728724 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.728708 2579 log.go:25] "Validated CRI v1 image API"
Apr 23 17:41:08.730507 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.730489 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 17:41:08.734547 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.734528 2579 fs.go:135] Filesystem UUIDs: map[67a957a5-2b26-4d02-b252-0d0ec0aa415d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a81cd879-9bb0-4c23-a6b4-8660bda4c546:/dev/nvme0n1p4]
Apr 23 17:41:08.734605 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.734548 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:41:08.739576 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.739471 2579 manager.go:217] Machine: {Timestamp:2026-04-23 17:41:08.738257068 +0000 UTC m=+0.437857931 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093360 MemoryCapacity:32812179456 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26fa9ddcbb512ffc017614ff99a775 SystemUUID:ec26fa9d-dcbb-512f-fc01-7614ff99a775 BootID:69142632-d5e2-46d1-a709-c83ebe29bcf2 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406089728 Type:vfs Inodes:4005393 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562439168 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a5:0d:eb:62:3f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a5:0d:eb:62:3f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:00:2c:4f:b6:a1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812179456 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:41:08.739576 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.739573 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:41:08.739715 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.739703 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 17:41:08.740026 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.740007 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 17:41:08.740171 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.740028 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-107.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 17:41:08.740216 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.740183 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 17:41:08.740216 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.740191 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 17:41:08.740216 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.740208 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:41:08.740296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.740219 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:41:08.742065 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.742055 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:41:08.742167 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.742158 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 17:41:08.744783 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.744773 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 23 17:41:08.745437 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.745425 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 17:41:08.745484 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.745446 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 17:41:08.745484 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.745459 2579 kubelet.go:397] "Adding apiserver pod source" Apr 23 17:41:08.745484 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.745468 2579 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 23 17:41:08.746562 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.746550 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:41:08.746611 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.746567 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:41:08.749910 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.749893 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:41:08.751484 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.751470 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:41:08.753627 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753615 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753632 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753639 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753645 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753651 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753656 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753662 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753667 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753675 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:41:08.753678 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753682 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:41:08.753908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753691 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:41:08.753908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.753700 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:41:08.754394 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.754385 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:41:08.754394 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.754394 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:41:08.757923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.757888 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-107.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:41:08.758048 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.757952 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:41:08.758048 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.757955 2579 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-107.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:41:08.758048 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.758016 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:41:08.758048 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.758041 2579 server.go:1295] "Started kubelet" Apr 23 17:41:08.758191 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.758129 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:41:08.758240 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.758126 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:41:08.758273 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.758255 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:41:08.758804 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.758767 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dfpjq" Apr 23 17:41:08.759033 ip-10-0-131-107 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 17:41:08.759525 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.759501 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 17:41:08.760866 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.760853 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 23 17:41:08.768411 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.768361 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dfpjq" Apr 23 17:41:08.769401 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.769387 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 17:41:08.769467 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.769406 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 17:41:08.769769 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.768339 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-107.ec2.internal.18a90d398a3e8daf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-107.ec2.internal,UID:ip-10-0-131-107.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-107.ec2.internal,},FirstTimestamp:2026-04-23 17:41:08.758023599 +0000 UTC m=+0.457624462,LastTimestamp:2026-04-23 17:41:08.758023599 +0000 UTC m=+0.457624462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-107.ec2.internal,}" Apr 23 17:41:08.770073 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770052 2579 factory.go:55] Registering 
systemd factory Apr 23 17:41:08.770162 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770088 2579 factory.go:223] Registration of the systemd container factory successfully Apr 23 17:41:08.770254 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770224 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 17:41:08.770254 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770253 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 17:41:08.770422 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770327 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 17:41:08.770422 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770345 2579 factory.go:153] Registering CRI-O factory Apr 23 17:41:08.770422 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770358 2579 factory.go:223] Registration of the crio container factory successfully Apr 23 17:41:08.770422 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770386 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 23 17:41:08.770422 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770394 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 23 17:41:08.770422 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.770385 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found" Apr 23 17:41:08.770664 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770479 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 17:41:08.770664 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770600 2579 factory.go:103] Registering Raw factory Apr 23 17:41:08.770664 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.770614 2579 manager.go:1196] Started 
watching for new ooms in manager Apr 23 17:41:08.771057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.771039 2579 manager.go:319] Starting recovery of all containers Apr 23 17:41:08.771825 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.771802 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 17:41:08.775516 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.775494 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:08.779588 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.779542 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-107.ec2.internal\" not found" node="ip-10-0-131-107.ec2.internal" Apr 23 17:41:08.782041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.782026 2579 manager.go:324] Recovery completed Apr 23 17:41:08.786099 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.785996 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:08.797644 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.797627 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:08.797714 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.797658 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:08.797714 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.797671 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:08.798223 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.798203 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 
23 17:41:08.798223 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.798222 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 17:41:08.798327 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.798240 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:41:08.800900 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.800888 2579 policy_none.go:49] "None policy: Start" Apr 23 17:41:08.800960 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.800903 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 17:41:08.800960 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.800913 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 23 17:41:08.835506 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.835484 2579 manager.go:341] "Starting Device Plugin manager" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.835566 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.835581 2579 server.go:85] "Starting device plugin registration server" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.835795 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.835806 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.836027 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.836115 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.836124 2579 plugin_manager.go:118] 
"Starting Kubelet Plugin Manager" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.836702 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 17:41:08.846256 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.836756 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-107.ec2.internal\" not found" Apr 23 17:41:08.900266 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.900188 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 17:41:08.901589 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.901573 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 17:41:08.901706 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.901601 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 17:41:08.901706 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.901622 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 17:41:08.901706 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.901634 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 17:41:08.901841 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.901738 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 17:41:08.904806 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.904788 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:08.936266 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.936246 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:08.937477 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.937463 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:08.937550 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.937501 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:08.937550 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.937513 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:08.937550 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.937541 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-107.ec2.internal" Apr 23 17:41:08.947533 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:08.947510 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-107.ec2.internal" Apr 23 17:41:08.947533 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.947535 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-107.ec2.internal\": node \"ip-10-0-131-107.ec2.internal\" not found" Apr 23 
17:41:08.962425 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:08.962404 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found" Apr 23 17:41:09.002799 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.002766 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal"] Apr 23 17:41:09.002871 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.002834 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:09.004348 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.004335 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:09.004405 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.004363 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:09.004405 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.004372 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:09.005699 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.005687 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:09.005841 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.005819 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.005895 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.005849 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:09.006748 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.006728 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:09.006834 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.006760 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:09.006834 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.006774 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:09.006834 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.006732 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:09.006926 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.006841 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:09.006926 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.006855 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:09.008267 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.008252 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.008319 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.008281 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:41:09.008921 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.008904 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:41:09.009007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.008932 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:41:09.009007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.008961 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:41:09.039660 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.039634 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-107.ec2.internal\" not found" node="ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.043918 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.043897 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-107.ec2.internal\" not found" node="ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.063352 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.063328 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found" Apr 23 17:41:09.071620 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.071595 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ecd169a393f97760ef199de7a191e-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal\" (UID: \"d88ecd169a393f97760ef199de7a191e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.071700 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.071628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d88ecd169a393f97760ef199de7a191e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal\" (UID: \"d88ecd169a393f97760ef199de7a191e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.071700 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.071647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ddd1d509744edf6dd1d0f6ae52b4d7c3-config\") pod \"kube-apiserver-proxy-ip-10-0-131-107.ec2.internal\" (UID: \"ddd1d509744edf6dd1d0f6ae52b4d7c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.163521 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.163433 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found" Apr 23 17:41:09.172824 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.172796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ecd169a393f97760ef199de7a191e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal\" (UID: \"d88ecd169a393f97760ef199de7a191e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" Apr 23 17:41:09.172970 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.172831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d88ecd169a393f97760ef199de7a191e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal\" (UID: \"d88ecd169a393f97760ef199de7a191e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.172970 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.172849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ddd1d509744edf6dd1d0f6ae52b4d7c3-config\") pod \"kube-apiserver-proxy-ip-10-0-131-107.ec2.internal\" (UID: \"ddd1d509744edf6dd1d0f6ae52b4d7c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.172970 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.172882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ecd169a393f97760ef199de7a191e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal\" (UID: \"d88ecd169a393f97760ef199de7a191e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.172970 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.172905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d88ecd169a393f97760ef199de7a191e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal\" (UID: \"d88ecd169a393f97760ef199de7a191e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.172970 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.172904 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ddd1d509744edf6dd1d0f6ae52b4d7c3-config\") pod \"kube-apiserver-proxy-ip-10-0-131-107.ec2.internal\" (UID: \"ddd1d509744edf6dd1d0f6ae52b4d7c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.263888 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.263854 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.342369 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.342342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.347003 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.346978 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:09.364779 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.364752 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.465333 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.465245 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.565758 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.565726 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.593695 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.593675 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:41:09.666066 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.666027 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.668193 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.668171 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:41:09.668383 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.668353 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:41:09.668383 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.668368 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:41:09.668524 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.668368 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:41:09.766237 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.766207 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.769865 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.769834 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:41:09.770991 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.770902 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:36:08 +0000 UTC" deadline="2028-01-27 04:06:13.724905296 +0000 UTC"
Apr 23 17:41:09.770991 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.770965 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15442h25m3.953952351s"
Apr 23 17:41:09.779682 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.779661 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:41:09.808061 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.808030 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j9chw"
Apr 23 17:41:09.815636 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.815607 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-j9chw"
Apr 23 17:41:09.848011 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:09.847821 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd1d509744edf6dd1d0f6ae52b4d7c3.slice/crio-9d260daae0ef27709b721e1558659dd3f193ea7f9cd35bc3365fe0f079c78cbe WatchSource:0}: Error finding container 9d260daae0ef27709b721e1558659dd3f193ea7f9cd35bc3365fe0f079c78cbe: Status 404 returned error can't find the container with id 9d260daae0ef27709b721e1558659dd3f193ea7f9cd35bc3365fe0f079c78cbe
Apr 23 17:41:09.848327 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:09.848308 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88ecd169a393f97760ef199de7a191e.slice/crio-4fae8398a3cce8787c477d7db150b1947ad3510a2c7963416e32a979ffc6077a WatchSource:0}: Error finding container 4fae8398a3cce8787c477d7db150b1947ad3510a2c7963416e32a979ffc6077a: Status 404 returned error can't find the container with id 4fae8398a3cce8787c477d7db150b1947ad3510a2c7963416e32a979ffc6077a
Apr 23 17:41:09.852202 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.852189 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:41:09.866506 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.866473 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:09.905075 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.905025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" event={"ID":"d88ecd169a393f97760ef199de7a191e","Type":"ContainerStarted","Data":"4fae8398a3cce8787c477d7db150b1947ad3510a2c7963416e32a979ffc6077a"}
Apr 23 17:41:09.905984 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:09.905962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal" event={"ID":"ddd1d509744edf6dd1d0f6ae52b4d7c3","Type":"ContainerStarted","Data":"9d260daae0ef27709b721e1558659dd3f193ea7f9cd35bc3365fe0f079c78cbe"}
Apr 23 17:41:09.967178 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:09.967152 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:10.067622 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.067583 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:10.168103 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.168063 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:10.268876 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.268834 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-107.ec2.internal\" not found"
Apr 23 17:41:10.286453 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.286427 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:41:10.370572 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.370448 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:10.382730 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.382700 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:41:10.383891 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.383865 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal"
Apr 23 17:41:10.390484 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.390463 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:41:10.746461 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.746393 2579 apiserver.go:52] "Watching apiserver"
Apr 23 17:41:10.752016 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.751987 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 17:41:10.752465 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.752444 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jldlx","openshift-multus/network-metrics-daemon-45qq7","openshift-network-diagnostics/network-check-target-h7vln","kube-system/konnectivity-agent-d8twf","kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal","openshift-multus/multus-additional-cni-plugins-qnskt","openshift-multus/multus-w5pwh","openshift-network-operator/iptables-alerter-hnppj","openshift-ovn-kubernetes/ovnkube-node-zxwk2","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh","openshift-cluster-node-tuning-operator/tuned-pcjg6","openshift-dns/node-resolver-thrhm"]
Apr 23 17:41:10.754118 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.754090 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.755340 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.755314 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:10.755439 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.755398 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.757347 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.757411 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rjjlh\""
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.757452 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.758030 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.758123 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.758030 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.758670 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:10.759773 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.758734 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:10.760239 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.759791 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:10.760239 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.759911 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.762079 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.761387 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.762079 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.761871 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 17:41:10.762079 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.761897 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ck86f\""
Apr 23 17:41:10.762079 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.761913 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 17:41:10.762321 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.762133 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.762321 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.762216 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8n4q4\""
Apr 23 17:41:10.762430 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.762359 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.762479 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.762456 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 17:41:10.762713 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.762694 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hnppj"
Apr 23 17:41:10.763144 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.763125 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:41:10.763342 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.763324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2gjvj\""
Apr 23 17:41:10.764358 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.764342 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:41:10.765106 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.764976 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fkl2c\""
Apr 23 17:41:10.765106 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.764997 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.765106 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.764983 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.766447 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.766427 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.766546 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.766510 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.768119 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.768103 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.768748 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.768734 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.769659 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.769639 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.769756 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.769670 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.770017 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.769874 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 17:41:10.770017 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.769887 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 17:41:10.770017 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.769965 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 17:41:10.770222 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.770205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vqfn4\""
Apr 23 17:41:10.770472 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.770433 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 17:41:10.770544 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.770480 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.770544 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.770443 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dxlhg\""
Apr 23 17:41:10.771435 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.771413 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.771518 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.771485 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.771568 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.771536 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-thrhm"
Apr 23 17:41:10.771745 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.771730 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q77pz\""
Apr 23 17:41:10.772973 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.772956 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 17:41:10.774074 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.774058 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 17:41:10.774174 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.774078 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 17:41:10.774234 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.774218 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-z64c7\""
Apr 23 17:41:10.783081 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-etc-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783173 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-serviceca\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.783173 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783125 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-systemd\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783173 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzdh8\" (UniqueName: \"kubernetes.io/projected/239fea19-d73f-4439-bcf2-56f2de8a52fd-kube-api-access-bzdh8\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.783302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783176 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.783302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783200 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f7zg\" (UniqueName: \"kubernetes.io/projected/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-kube-api-access-7f7zg\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.783302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/139eafee-efc6-482a-9490-90498d03dec5-iptables-alerter-script\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj"
Apr 23 17:41:10.783302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/139eafee-efc6-482a-9490-90498d03dec5-host-slash\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj"
Apr 23 17:41:10.783302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-systemd\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-host\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-os-release\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-systemd-units\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783383 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95g9f\" (UniqueName: \"kubernetes.io/projected/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-kube-api-access-95g9f\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783411 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783503 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-socket-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783520 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783544 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-daemon-config\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783603 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovn-node-metrics-cert\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b2c8d6fa-fde8-484b-bc81-f7412492a7fa-agent-certs\") pod \"konnectivity-agent-d8twf\" (UID: \"b2c8d6fa-fde8-484b-bc81-f7412492a7fa\") " pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783682 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-socket-dir-parent\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-netns\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-run-netns\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.783762 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-modprobe-d\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysctl-d\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783813 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c79a7d29-1318-4068-9105-f9bf27e21682-etc-tuned\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovnkube-config\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-os-release\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-system-cni-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-etc-kubernetes\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783974 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-node-log\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.783997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-cni-netd\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-var-lib-kubelet\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784048 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvnz\" (UniqueName: \"kubernetes.io/projected/5b02fbde-34d7-498c-ba52-33a9307442e3-kube-api-access-9nvnz\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-hostroot\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-conf-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-slash\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-sys\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.784208 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-lib-modules\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.784208
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-system-cni-dir\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a3e43e9-302e-40d9-ac72-286f11253b7c-cni-binary-copy\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-multus-certs\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2k5c\" (UniqueName: \"kubernetes.io/projected/2a3e43e9-302e-40d9-ac72-286f11253b7c-kube-api-access-w2k5c\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-ovn\") pod \"ovnkube-node-zxwk2\" (UID: 
\"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysconfig\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysctl-conf\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-host\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/994fa59f-956e-4d6b-8074-e3d2459771d9-tmp-dir\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovnkube-script-lib\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784555 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-run\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784665 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c79a7d29-1318-4068-9105-f9bf27e21682-tmp\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smr8c\" (UniqueName: \"kubernetes.io/projected/994fa59f-956e-4d6b-8074-e3d2459771d9-kube-api-access-smr8c\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784714 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-cni-bin\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-log-socket\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-registration-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 17:41:10.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784790 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-kubernetes\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/994fa59f-956e-4d6b-8074-e3d2459771d9-hosts-file\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm" Apr 
23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/51d0c740-3b6f-4927-90d0-03577afcf352-kube-api-access-dwlcc\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-cni-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-kubelet\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.784931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-kubelet\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhvz\" (UniqueName: \"kubernetes.io/projected/c79a7d29-1318-4068-9105-f9bf27e21682-kube-api-access-sdhvz\") pod 
\"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-cnibin\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785070 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-k8s-cni-cncf-io\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-var-lib-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-cni-bin\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-env-overrides\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b2c8d6fa-fde8-484b-bc81-f7412492a7fa-konnectivity-ca\") pod \"konnectivity-agent-d8twf\" (UID: \"b2c8d6fa-fde8-484b-bc81-f7412492a7fa\") " pod="kube-system/konnectivity-agent-d8twf" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-cni-multus\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtmgw\" (UniqueName: \"kubernetes.io/projected/139eafee-efc6-482a-9490-90498d03dec5-kube-api-access-rtmgw\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj" Apr 23 17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785279 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-device-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 
17:41:10.785397 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-sys-fs\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 17:41:10.785990 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785336 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.785990 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.785362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-cnibin\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.816335 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.816208 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:09 +0000 UTC" deadline="2027-12-16 13:14:49.491700528 +0000 UTC" Apr 23 17:41:10.816335 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.816236 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14443h33m38.675467524s" Apr 23 17:41:10.860371 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.860346 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 
17:41:10.871020 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.870992 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:41:10.886139 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-run-netns\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-modprobe-d\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.886276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886179 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysctl-d\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.886276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c79a7d29-1318-4068-9105-f9bf27e21682-etc-tuned\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.886276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-run-netns\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886230 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovnkube-config\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-os-release\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-system-cni-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886283 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-modprobe-d\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886297 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-etc-kubernetes\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886339 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-etc-kubernetes\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886347 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysctl-d\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-node-log\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-node-log\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-os-release\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-system-cni-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.886548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886521 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-cni-netd\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-var-lib-kubelet\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvnz\" (UniqueName: \"kubernetes.io/projected/5b02fbde-34d7-498c-ba52-33a9307442e3-kube-api-access-9nvnz\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-hostroot\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-conf-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886664 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-slash\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-sys\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-lib-modules\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-system-cni-dir\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886741 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a3e43e9-302e-40d9-ac72-286f11253b7c-cni-binary-copy\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-multus-certs\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2k5c\" (UniqueName: \"kubernetes.io/projected/2a3e43e9-302e-40d9-ac72-286f11253b7c-kube-api-access-w2k5c\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-ovn\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysconfig\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysctl-conf\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-host\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886957 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/994fa59f-956e-4d6b-8074-e3d2459771d9-tmp-dir\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.886992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-cni-netd\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887010 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovnkube-config\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovnkube-script-lib\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysconfig\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-sys\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-system-cni-dir\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-conf-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-run\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887158 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-lib-modules\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-ovn\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887177 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-multus-certs\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887180 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-hostroot\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c79a7d29-1318-4068-9105-f9bf27e21682-tmp\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887234 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-host\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-run\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smr8c\" (UniqueName: \"kubernetes.io/projected/994fa59f-956e-4d6b-8074-e3d2459771d9-kube-api-access-smr8c\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-var-lib-kubelet\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887293 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-slash\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-cni-bin\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887374 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-sysctl-conf\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-cni-bin\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a3e43e9-302e-40d9-ac72-286f11253b7c-cni-binary-copy\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-log-socket\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-log-socket\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovnkube-script-lib\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-registration-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-kubernetes\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-registration-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/994fa59f-956e-4d6b-8074-e3d2459771d9-hosts-file\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887760 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-kubernetes\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/51d0c740-3b6f-4927-90d0-03577afcf352-kube-api-access-dwlcc\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/994fa59f-956e-4d6b-8074-e3d2459771d9-hosts-file\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-cni-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-cni-dir\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.888709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-kubelet\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-kubelet\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhvz\" (UniqueName: \"kubernetes.io/projected/c79a7d29-1318-4068-9105-f9bf27e21682-kube-api-access-sdhvz\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-kubelet\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-kubelet\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-cnibin\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.887988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-cnibin\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-k8s-cni-cncf-io\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-var-lib-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-cni-bin\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-env-overrides\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b2c8d6fa-fde8-484b-bc81-f7412492a7fa-konnectivity-ca\") pod \"konnectivity-agent-d8twf\" (UID: \"b2c8d6fa-fde8-484b-bc81-f7412492a7fa\") " pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-k8s-cni-cncf-io\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-var-lib-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-cni-multus\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-cni-bin\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.889537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888251 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-var-lib-cni-multus\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.887908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtmgw\" (UniqueName: \"kubernetes.io/projected/139eafee-efc6-482a-9490-90498d03dec5-kube-api-access-rtmgw\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-device-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-sys-fs\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-device-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-cnibin\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888433 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-etc-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-sys-fs\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-serviceca\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-systemd\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888492 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-cnibin\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdh8\" (UniqueName: \"kubernetes.io/projected/239fea19-d73f-4439-bcf2-56f2de8a52fd-kube-api-access-bzdh8\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888521 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f7zg\" (UniqueName: \"kubernetes.io/projected/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-kube-api-access-7f7zg\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/139eafee-efc6-482a-9490-90498d03dec5-iptables-alerter-script\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-env-overrides\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.890418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/139eafee-efc6-482a-9490-90498d03dec5-host-slash\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-systemd\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-host\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-os-release\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-systemd-units\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95g9f\" (UniqueName: \"kubernetes.io/projected/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-kube-api-access-95g9f\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b2c8d6fa-fde8-484b-bc81-f7412492a7fa-konnectivity-ca\") pod \"konnectivity-agent-d8twf\" (UID: \"b2c8d6fa-fde8-484b-bc81-f7412492a7fa\") " pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c79a7d29-1318-4068-9105-f9bf27e21682-etc-systemd\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888820 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/994fa59f-956e-4d6b-8074-e3d2459771d9-tmp-dir\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888858 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-socket-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888872 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-host\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-os-release\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.888967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-etc-selinux\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh"
Apr 23 17:41:10.891057 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt"
Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-run-systemd\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889112 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-etc-openvswitch\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:10.891703 ip-10-0-131-107
kubenswrapper[2579]: I0423 17:41:10.889111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-daemon-config\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovn-node-metrics-cert\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b2c8d6fa-fde8-484b-bc81-f7412492a7fa-agent-certs\") pod \"konnectivity-agent-d8twf\" (UID: \"b2c8d6fa-fde8-484b-bc81-f7412492a7fa\") " pod="kube-system/konnectivity-agent-d8twf" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-socket-dir-parent\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" 
Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-netns\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/239fea19-d73f-4439-bcf2-56f2de8a52fd-socket-dir\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qnskt\" (UID: 
\"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889506 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-systemd-units\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b02fbde-34d7-498c-ba52-33a9307442e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-serviceca\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-daemon-config\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.891703 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.889569 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889593 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b02fbde-34d7-498c-ba52-33a9307442e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/139eafee-efc6-482a-9490-90498d03dec5-host-slash\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj" Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-multus-socket-dir-parent\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.889678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a3e43e9-302e-40d9-ac72-286f11253b7c-host-run-netns\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.889748 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:11.389712379 +0000 UTC m=+3.089313230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.890284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/139eafee-efc6-482a-9490-90498d03dec5-iptables-alerter-script\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj" Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.890563 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c79a7d29-1318-4068-9105-f9bf27e21682-etc-tuned\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.892338 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.890602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c79a7d29-1318-4068-9105-f9bf27e21682-tmp\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.892651 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.892540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b2c8d6fa-fde8-484b-bc81-f7412492a7fa-agent-certs\") pod \"konnectivity-agent-d8twf\" (UID: \"b2c8d6fa-fde8-484b-bc81-f7412492a7fa\") " pod="kube-system/konnectivity-agent-d8twf" Apr 23 17:41:10.893621 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.893587 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-ovn-node-metrics-cert\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.896868 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.896811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smr8c\" (UniqueName: \"kubernetes.io/projected/994fa59f-956e-4d6b-8074-e3d2459771d9-kube-api-access-smr8c\") pod \"node-resolver-thrhm\" (UID: \"994fa59f-956e-4d6b-8074-e3d2459771d9\") " pod="openshift-dns/node-resolver-thrhm" Apr 23 17:41:10.897086 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.897066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/51d0c740-3b6f-4927-90d0-03577afcf352-kube-api-access-dwlcc\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:10.897549 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.897520 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95g9f\" (UniqueName: \"kubernetes.io/projected/b82ed798-7ebc-40ee-8b35-03a742ad4e5e-kube-api-access-95g9f\") pod \"ovnkube-node-zxwk2\" (UID: \"b82ed798-7ebc-40ee-8b35-03a742ad4e5e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:10.898058 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.898033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvnz\" (UniqueName: \"kubernetes.io/projected/5b02fbde-34d7-498c-ba52-33a9307442e3-kube-api-access-9nvnz\") pod \"multus-additional-cni-plugins-qnskt\" (UID: \"5b02fbde-34d7-498c-ba52-33a9307442e3\") " pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:10.898424 ip-10-0-131-107 kubenswrapper[2579]: 
I0423 17:41:10.898406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2k5c\" (UniqueName: \"kubernetes.io/projected/2a3e43e9-302e-40d9-ac72-286f11253b7c-kube-api-access-w2k5c\") pod \"multus-w5pwh\" (UID: \"2a3e43e9-302e-40d9-ac72-286f11253b7c\") " pod="openshift-multus/multus-w5pwh" Apr 23 17:41:10.899023 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.899005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhvz\" (UniqueName: \"kubernetes.io/projected/c79a7d29-1318-4068-9105-f9bf27e21682-kube-api-access-sdhvz\") pod \"tuned-pcjg6\" (UID: \"c79a7d29-1318-4068-9105-f9bf27e21682\") " pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:10.901960 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.901918 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:10.901960 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.901962 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:10.902120 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.901977 2579 projected.go:194] Error preparing data for projected volume kube-api-access-trz4v for pod openshift-network-diagnostics/network-check-target-h7vln: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:10.902120 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:10.902032 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v podName:7f6fcfee-1046-4e89-a5c4-e3d550d4056f nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:11.402015009 +0000 UTC m=+3.101615859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-trz4v" (UniqueName: "kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v") pod "network-check-target-h7vln" (UID: "7f6fcfee-1046-4e89-a5c4-e3d550d4056f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:10.903442 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.903418 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtmgw\" (UniqueName: \"kubernetes.io/projected/139eafee-efc6-482a-9490-90498d03dec5-kube-api-access-rtmgw\") pod \"iptables-alerter-hnppj\" (UID: \"139eafee-efc6-482a-9490-90498d03dec5\") " pod="openshift-network-operator/iptables-alerter-hnppj" Apr 23 17:41:10.904566 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.904543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f7zg\" (UniqueName: \"kubernetes.io/projected/672ee16e-ccf3-47b3-a727-a91e0e7a9fbc-kube-api-access-7f7zg\") pod \"node-ca-jldlx\" (UID: \"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc\") " pod="openshift-image-registry/node-ca-jldlx" Apr 23 17:41:10.904659 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:10.904647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzdh8\" (UniqueName: \"kubernetes.io/projected/239fea19-d73f-4439-bcf2-56f2de8a52fd-kube-api-access-bzdh8\") pod \"aws-ebs-csi-driver-node-k6gbh\" (UID: \"239fea19-d73f-4439-bcf2-56f2de8a52fd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 17:41:11.068097 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.068063 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qnskt" Apr 23 17:41:11.077037 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.077009 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d8twf" Apr 23 17:41:11.080875 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.080854 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:11.084618 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.084594 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jldlx" Apr 23 17:41:11.091233 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.091214 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w5pwh" Apr 23 17:41:11.099900 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.099875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hnppj" Apr 23 17:41:11.106576 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.106556 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:41:11.113105 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.113089 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" Apr 23 17:41:11.119780 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.119765 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" Apr 23 17:41:11.126323 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.126306 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-thrhm" Apr 23 17:41:11.393286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.393197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:11.393415 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.393352 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:11.393415 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.393405 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:12.393391121 +0000 UTC m=+4.092991970 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:11.477776 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.477745 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994fa59f_956e_4d6b_8074_e3d2459771d9.slice/crio-5d65858ac001a58f02378eb3a4e791fd66c7144f2fdf444cc62c5a43f82b1d69 WatchSource:0}: Error finding container 5d65858ac001a58f02378eb3a4e791fd66c7144f2fdf444cc62c5a43f82b1d69: Status 404 returned error can't find the container with id 5d65858ac001a58f02378eb3a4e791fd66c7144f2fdf444cc62c5a43f82b1d69 Apr 23 17:41:11.479520 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.479492 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod672ee16e_ccf3_47b3_a727_a91e0e7a9fbc.slice/crio-882b84ca22d65405c56eaffee5626621b45256c0c2eadd077fb45363d7dee822 WatchSource:0}: Error finding container 882b84ca22d65405c56eaffee5626621b45256c0c2eadd077fb45363d7dee822: Status 404 returned error can't find the container with id 882b84ca22d65405c56eaffee5626621b45256c0c2eadd077fb45363d7dee822 Apr 23 17:41:11.483755 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.483488 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2c8d6fa_fde8_484b_bc81_f7412492a7fa.slice/crio-10f0db4a4e98c5e9067ea9c88f362f5fac37acae5b3d3190dd063656942f17ec WatchSource:0}: Error finding container 10f0db4a4e98c5e9067ea9c88f362f5fac37acae5b3d3190dd063656942f17ec: Status 404 returned error can't find the container with id 10f0db4a4e98c5e9067ea9c88f362f5fac37acae5b3d3190dd063656942f17ec Apr 23 17:41:11.485989 
ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.485960 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139eafee_efc6_482a_9490_90498d03dec5.slice/crio-955857b0dbbff1cdd28f8a73a595958bffe3173077306abf16684e5f2c69cd57 WatchSource:0}: Error finding container 955857b0dbbff1cdd28f8a73a595958bffe3173077306abf16684e5f2c69cd57: Status 404 returned error can't find the container with id 955857b0dbbff1cdd28f8a73a595958bffe3173077306abf16684e5f2c69cd57 Apr 23 17:41:11.486564 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.486545 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239fea19_d73f_4439_bcf2_56f2de8a52fd.slice/crio-3b61a7ee937a70c67f2919e8a73c7b241d282ae25634d077f29b4097b447b162 WatchSource:0}: Error finding container 3b61a7ee937a70c67f2919e8a73c7b241d282ae25634d077f29b4097b447b162: Status 404 returned error can't find the container with id 3b61a7ee937a70c67f2919e8a73c7b241d282ae25634d077f29b4097b447b162 Apr 23 17:41:11.487473 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.487442 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3e43e9_302e_40d9_ac72_286f11253b7c.slice/crio-fb48d3ec8788e36b8aba788e5a2389c149e551f11dbfb8a8a7abbfb6fa625a2d WatchSource:0}: Error finding container fb48d3ec8788e36b8aba788e5a2389c149e551f11dbfb8a8a7abbfb6fa625a2d: Status 404 returned error can't find the container with id fb48d3ec8788e36b8aba788e5a2389c149e551f11dbfb8a8a7abbfb6fa625a2d Apr 23 17:41:11.488539 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.488483 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b02fbde_34d7_498c_ba52_33a9307442e3.slice/crio-5109ace1e2a049e2ade8a1c2146c0586bd2eae99903a0fe1e3f5f16dc48e53d1 WatchSource:0}: 
Error finding container 5109ace1e2a049e2ade8a1c2146c0586bd2eae99903a0fe1e3f5f16dc48e53d1: Status 404 returned error can't find the container with id 5109ace1e2a049e2ade8a1c2146c0586bd2eae99903a0fe1e3f5f16dc48e53d1 Apr 23 17:41:11.489308 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:11.489257 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79a7d29_1318_4068_9105_f9bf27e21682.slice/crio-45074ee67ec2b1d1cd439b2d9e97ff796e92b5b317fdee86dd520ca0544782e4 WatchSource:0}: Error finding container 45074ee67ec2b1d1cd439b2d9e97ff796e92b5b317fdee86dd520ca0544782e4: Status 404 returned error can't find the container with id 45074ee67ec2b1d1cd439b2d9e97ff796e92b5b317fdee86dd520ca0544782e4 Apr 23 17:41:11.493797 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.493775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:11.493927 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.493910 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:11.494009 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.493956 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:11.494009 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.493970 2579 projected.go:194] Error preparing data for projected volume kube-api-access-trz4v for pod openshift-network-diagnostics/network-check-target-h7vln: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:11.494116 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.494033 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v podName:7f6fcfee-1046-4e89-a5c4-e3d550d4056f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:12.494012888 +0000 UTC m=+4.193613738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-trz4v" (UniqueName: "kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v") pod "network-check-target-h7vln" (UID: "7f6fcfee-1046-4e89-a5c4-e3d550d4056f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:11.816564 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.816518 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:09 +0000 UTC" deadline="2027-11-22 13:14:27.215092141 +0000 UTC" Apr 23 17:41:11.816564 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.816559 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13867h33m15.398537137s" Apr 23 17:41:11.902364 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.902254 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:11.902524 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:11.902398 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:11.919106 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.918415 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal" event={"ID":"ddd1d509744edf6dd1d0f6ae52b4d7c3","Type":"ContainerStarted","Data":"f2e7f7124f823ce321f8799011bad45aef8b62f26df5157ca265cc1b9097ff0c"} Apr 23 17:41:11.926438 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.926371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"8bebf593d4fdac981250104f81ada55ec11c51b3781603a7b9b29cdfe104224b"} Apr 23 17:41:11.930921 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.930690 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerStarted","Data":"5109ace1e2a049e2ade8a1c2146c0586bd2eae99903a0fe1e3f5f16dc48e53d1"} Apr 23 17:41:11.938164 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.938104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" event={"ID":"239fea19-d73f-4439-bcf2-56f2de8a52fd","Type":"ContainerStarted","Data":"3b61a7ee937a70c67f2919e8a73c7b241d282ae25634d077f29b4097b447b162"} Apr 23 17:41:11.955654 ip-10-0-131-107 
kubenswrapper[2579]: I0423 17:41:11.955596 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hnppj" event={"ID":"139eafee-efc6-482a-9490-90498d03dec5","Type":"ContainerStarted","Data":"955857b0dbbff1cdd28f8a73a595958bffe3173077306abf16684e5f2c69cd57"} Apr 23 17:41:11.966817 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.966749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jldlx" event={"ID":"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc","Type":"ContainerStarted","Data":"882b84ca22d65405c56eaffee5626621b45256c0c2eadd077fb45363d7dee822"} Apr 23 17:41:11.968690 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.968624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" event={"ID":"c79a7d29-1318-4068-9105-f9bf27e21682","Type":"ContainerStarted","Data":"45074ee67ec2b1d1cd439b2d9e97ff796e92b5b317fdee86dd520ca0544782e4"} Apr 23 17:41:11.974205 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.974124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w5pwh" event={"ID":"2a3e43e9-302e-40d9-ac72-286f11253b7c","Type":"ContainerStarted","Data":"fb48d3ec8788e36b8aba788e5a2389c149e551f11dbfb8a8a7abbfb6fa625a2d"} Apr 23 17:41:11.990738 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.990687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d8twf" event={"ID":"b2c8d6fa-fde8-484b-bc81-f7412492a7fa","Type":"ContainerStarted","Data":"10f0db4a4e98c5e9067ea9c88f362f5fac37acae5b3d3190dd063656942f17ec"} Apr 23 17:41:11.992356 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:11.992309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-thrhm" event={"ID":"994fa59f-956e-4d6b-8074-e3d2459771d9","Type":"ContainerStarted","Data":"5d65858ac001a58f02378eb3a4e791fd66c7144f2fdf444cc62c5a43f82b1d69"} Apr 23 17:41:12.065841 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:12.065809 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:12.401248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:12.400587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:12.401248 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.400752 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:12.401248 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.400819 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:14.400800056 +0000 UTC m=+6.100400912 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:12.501413 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:12.501320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:12.501636 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.501523 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:12.501636 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.501549 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:12.501636 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.501564 2579 projected.go:194] Error preparing data for projected volume kube-api-access-trz4v for pod openshift-network-diagnostics/network-check-target-h7vln: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:12.501800 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.501642 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v podName:7f6fcfee-1046-4e89-a5c4-e3d550d4056f nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:14.501613128 +0000 UTC m=+6.201213980 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-trz4v" (UniqueName: "kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v") pod "network-check-target-h7vln" (UID: "7f6fcfee-1046-4e89-a5c4-e3d550d4056f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:12.903204 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:12.903169 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:12.903677 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:12.903299 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:13.014662 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:13.014124 2579 generic.go:358] "Generic (PLEG): container finished" podID="d88ecd169a393f97760ef199de7a191e" containerID="675aa9e6bb7f22ce795e4b621ea5ca8aef369d9542590d04d51e6da535f256e3" exitCode=0 Apr 23 17:41:13.014662 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:13.014295 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" event={"ID":"d88ecd169a393f97760ef199de7a191e","Type":"ContainerDied","Data":"675aa9e6bb7f22ce795e4b621ea5ca8aef369d9542590d04d51e6da535f256e3"} Apr 23 17:41:13.029811 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:13.029758 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-107.ec2.internal" podStartSLOduration=3.029735136 podStartE2EDuration="3.029735136s" podCreationTimestamp="2026-04-23 17:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:11.932411483 +0000 UTC m=+3.632012359" watchObservedRunningTime="2026-04-23 17:41:13.029735136 +0000 UTC m=+4.729336009" Apr 23 17:41:13.902336 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:13.902298 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:13.902591 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:13.902445 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:14.022505 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:14.022468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" event={"ID":"d88ecd169a393f97760ef199de7a191e","Type":"ContainerStarted","Data":"907a34732db53cd3dafc63106b037188a285b9e8481fb9c2a7c3283b109e4756"} Apr 23 17:41:14.036157 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:14.036105 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-107.ec2.internal" podStartSLOduration=4.036087127 podStartE2EDuration="4.036087127s" podCreationTimestamp="2026-04-23 17:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:14.036086555 +0000 UTC m=+5.735687430" watchObservedRunningTime="2026-04-23 17:41:14.036087127 +0000 UTC m=+5.735688001" Apr 23 17:41:14.415849 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:14.415807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:14.416062 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.416023 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:14.416130 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.416099 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs 
podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:18.416080145 +0000 UTC m=+10.115680998 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:14.516199 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:14.516158 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:14.516440 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.516415 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:14.516512 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.516449 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:14.516512 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.516466 2579 projected.go:194] Error preparing data for projected volume kube-api-access-trz4v for pod openshift-network-diagnostics/network-check-target-h7vln: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:14.516624 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.516527 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v podName:7f6fcfee-1046-4e89-a5c4-e3d550d4056f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:18.516512679 +0000 UTC m=+10.216113529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-trz4v" (UniqueName: "kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v") pod "network-check-target-h7vln" (UID: "7f6fcfee-1046-4e89-a5c4-e3d550d4056f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:14.902667 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:14.902637 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:14.902841 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:14.902761 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:15.901999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:15.901962 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:15.902443 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:15.902109 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:16.905146 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:16.904653 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:16.905146 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:16.904775 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:17.902445 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:17.901962 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:17.902445 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:17.902092 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:18.448849 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:18.448804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:18.449420 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.448990 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:18.449420 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.449068 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:26.449044912 +0000 UTC m=+18.148645773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:18.549431 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:18.549359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:18.549605 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.549542 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:18.549605 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.549565 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:18.549605 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.549582 2579 projected.go:194] Error preparing data for projected volume kube-api-access-trz4v for pod openshift-network-diagnostics/network-check-target-h7vln: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:18.549771 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.549644 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v podName:7f6fcfee-1046-4e89-a5c4-e3d550d4056f nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:26.549625382 +0000 UTC m=+18.249226235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-trz4v" (UniqueName: "kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v") pod "network-check-target-h7vln" (UID: "7f6fcfee-1046-4e89-a5c4-e3d550d4056f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:18.902818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:18.902782 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:18.902998 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:18.902867 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:19.902661 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:19.902621 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:19.903148 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:19.902775 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:20.903168 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:20.902639 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:20.903168 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:20.902773 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:21.902248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:21.902214 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:21.902420 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:21.902357 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:22.902241 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:22.902208 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:22.902652 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:22.902318 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:23.901884 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:23.901846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:23.902087 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:23.901977 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:24.903199 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:24.902954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:24.903623 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:24.903302 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:25.902651 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:25.902614 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:25.902845 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:25.902740 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:41:26.514203 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:26.514164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:26.514598 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.514355 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:26.514598 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.514440 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:42.514417867 +0000 UTC m=+34.214018726 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:26.615156 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:26.615115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:26.615324 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.615255 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:26.615324 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.615272 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:26.615324 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.615281 2579 projected.go:194] Error preparing data for projected volume kube-api-access-trz4v for pod openshift-network-diagnostics/network-check-target-h7vln: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:26.615492 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.615340 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v podName:7f6fcfee-1046-4e89-a5c4-e3d550d4056f nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:42.615322069 +0000 UTC m=+34.314922921 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-trz4v" (UniqueName: "kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v") pod "network-check-target-h7vln" (UID: "7f6fcfee-1046-4e89-a5c4-e3d550d4056f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:26.902114 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:26.902042 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:26.902254 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:26.902148 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f" Apr 23 17:41:27.902180 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:27.902144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:27.902579 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:27.902264 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:28.904277 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:28.903743 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:28.904277 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:28.903854 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:29.049928 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:29.049890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" event={"ID":"c79a7d29-1318-4068-9105-f9bf27e21682","Type":"ContainerStarted","Data":"56fdd64c98eb01a1b190edd15a0bd5e48519a2ec36d21f1af404bfb295389174"}
Apr 23 17:41:29.055028 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:29.054903 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w5pwh" event={"ID":"2a3e43e9-302e-40d9-ac72-286f11253b7c","Type":"ContainerStarted","Data":"4aafb20dd3e5b5260883b84ccf63f0123bf627c8302d9fd0a5a68cada5e9169d"}
Apr 23 17:41:29.057326 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:29.057297 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"e2f649f7a5e5f998911cd9bc4a32b772c5c8652be2770bd49266579c0d100772"}
Apr 23 17:41:29.061558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:29.061035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerStarted","Data":"d7175f4fa55a1c2ca2479c7790cd2f17db437be93fed5e13a0abaf00af11d58c"}
Apr 23 17:41:29.068460 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:29.067883 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pcjg6" podStartSLOduration=2.7566364439999997 podStartE2EDuration="20.067864809s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.492790841 +0000 UTC m=+3.192391710" lastFinishedPulling="2026-04-23 17:41:28.804019211 +0000 UTC m=+20.503620075" observedRunningTime="2026-04-23 17:41:29.067298949 +0000 UTC m=+20.766899821" watchObservedRunningTime="2026-04-23 17:41:29.067864809 +0000 UTC m=+20.767465683"
Apr 23 17:41:29.902918 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:29.902738 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:29.903063 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:29.903017 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:30.064162 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.064125 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d8twf" event={"ID":"b2c8d6fa-fde8-484b-bc81-f7412492a7fa","Type":"ContainerStarted","Data":"749ab0422174b73dd6cc82ab73a3ed724eac3d2fc4e2cf5f7453d32038ee5813"}
Apr 23 17:41:30.065449 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.065421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-thrhm" event={"ID":"994fa59f-956e-4d6b-8074-e3d2459771d9","Type":"ContainerStarted","Data":"9bc1f3b3a55bd494f8f903fabb960e6c50f289fa8834d41bb83f5a832215dafb"}
Apr 23 17:41:30.067623 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.067606 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 17:41:30.067952 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.067910 2579 generic.go:358] "Generic (PLEG): container finished" podID="b82ed798-7ebc-40ee-8b35-03a742ad4e5e" containerID="f5cb4a780029325f849412fb53d9f4277d21d9e88258dc7f30ec389431aeb19c" exitCode=1
Apr 23 17:41:30.068066 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.067963 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"54772edcda7dc7f067e18bb373f80fab1b58386cc9ab6fa099bc77bdfab493d9"}
Apr 23 17:41:30.068066 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.067990 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"a1a69dd1c77c89dd0ee4f133a0ff56790cebd2433acf69a426ab2fb105d05ffb"}
Apr 23 17:41:30.068066 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.068005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"a594b88d5f97583f57018c2df332871d40315cac44c9a40f6345fd0d538a4fab"}
Apr 23 17:41:30.068066 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.068022 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"a19996500a065b076c280325770b36ef9a14483c359d57c659c74df8153c1811"}
Apr 23 17:41:30.068066 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.068033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerDied","Data":"f5cb4a780029325f849412fb53d9f4277d21d9e88258dc7f30ec389431aeb19c"}
Apr 23 17:41:30.069715 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.069690 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b02fbde-34d7-498c-ba52-33a9307442e3" containerID="d7175f4fa55a1c2ca2479c7790cd2f17db437be93fed5e13a0abaf00af11d58c" exitCode=0
Apr 23 17:41:30.070011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.069981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerDied","Data":"d7175f4fa55a1c2ca2479c7790cd2f17db437be93fed5e13a0abaf00af11d58c"}
Apr 23 17:41:30.071725 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.071705 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" event={"ID":"239fea19-d73f-4439-bcf2-56f2de8a52fd","Type":"ContainerStarted","Data":"e0ad3ee62e263d71c718773a2c37e5f6c6bcc778d0db8031cfb6464447aa6a88"}
Apr 23 17:41:30.072953 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.072915 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jldlx" event={"ID":"672ee16e-ccf3-47b3-a727-a91e0e7a9fbc","Type":"ContainerStarted","Data":"881d96afd38df8d3ef01bde81fdee1e9be76c7f57addfb390c1cc41b89f75e85"}
Apr 23 17:41:30.077929 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.077893 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w5pwh" podStartSLOduration=3.745727242 podStartE2EDuration="21.077882475s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.489736902 +0000 UTC m=+3.189337765" lastFinishedPulling="2026-04-23 17:41:28.82189213 +0000 UTC m=+20.521492998" observedRunningTime="2026-04-23 17:41:29.104902185 +0000 UTC m=+20.804503057" watchObservedRunningTime="2026-04-23 17:41:30.077882475 +0000 UTC m=+21.777483346"
Apr 23 17:41:30.078365 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.078342 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-d8twf" podStartSLOduration=3.919088609 podStartE2EDuration="21.07833792s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.485259157 +0000 UTC m=+3.184860006" lastFinishedPulling="2026-04-23 17:41:28.644508453 +0000 UTC m=+20.344109317" observedRunningTime="2026-04-23 17:41:30.077959497 +0000 UTC m=+21.777560367" watchObservedRunningTime="2026-04-23 17:41:30.07833792 +0000 UTC m=+21.777938792"
Apr 23 17:41:30.109583 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.109510 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jldlx" podStartSLOduration=3.924304716 podStartE2EDuration="21.10949713s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.481598352 +0000 UTC m=+3.181199201" lastFinishedPulling="2026-04-23 17:41:28.666790751 +0000 UTC m=+20.366391615" observedRunningTime="2026-04-23 17:41:30.109402773 +0000 UTC m=+21.809003642" watchObservedRunningTime="2026-04-23 17:41:30.10949713 +0000 UTC m=+21.809097981"
Apr 23 17:41:30.446568 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.446404 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 17:41:30.848896 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.848652 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:41:30.446563611Z","UUID":"a06a700e-bd93-485e-9a4d-ec522f08838a","Handler":null,"Name":"","Endpoint":""}
Apr 23 17:41:30.853632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.853605 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 17:41:30.853788 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.853641 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 17:41:30.904467 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:30.904435 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:30.904672 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:30.904553 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:31.076326 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:31.076288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" event={"ID":"239fea19-d73f-4439-bcf2-56f2de8a52fd","Type":"ContainerStarted","Data":"805eadc5b66d04e23ca2a1b297678339253b5dcfa25e8f5c7c15493fe22b78bc"}
Apr 23 17:41:31.077697 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:31.077673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hnppj" event={"ID":"139eafee-efc6-482a-9490-90498d03dec5","Type":"ContainerStarted","Data":"a99cc112d23d8d3da967b66374ec93364377ad6f658de709dc555842db7b1270"}
Apr 23 17:41:31.090772 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:31.090707 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-thrhm" podStartSLOduration=4.7488586470000005 podStartE2EDuration="22.090689245s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.480068649 +0000 UTC m=+3.179669510" lastFinishedPulling="2026-04-23 17:41:28.821899257 +0000 UTC m=+20.521500108" observedRunningTime="2026-04-23 17:41:30.122223877 +0000 UTC m=+21.821824749" watchObservedRunningTime="2026-04-23 17:41:31.090689245 +0000 UTC m=+22.790290118"
Apr 23 17:41:31.091217 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:31.091187 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hnppj" podStartSLOduration=4.774867836 podStartE2EDuration="22.091180991s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.487704418 +0000 UTC m=+3.187305269" lastFinishedPulling="2026-04-23 17:41:28.804017575 +0000 UTC m=+20.503618424" observedRunningTime="2026-04-23 17:41:31.090553323 +0000 UTC m=+22.790154196" watchObservedRunningTime="2026-04-23 17:41:31.091180991 +0000 UTC m=+22.790781916"
Apr 23 17:41:31.362295 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:31.362255 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:31.902514 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:31.902433 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:31.902663 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:31.902580 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:32.082372 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.082341 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 17:41:32.082923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.082712 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"d3d41cedf6d01e3c9c334afdad9205282abc4250cf3e073f2cbb16a5a7ad934d"}
Apr 23 17:41:32.084699 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.084671 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" event={"ID":"239fea19-d73f-4439-bcf2-56f2de8a52fd","Type":"ContainerStarted","Data":"0a645283fbf93f7b56eda39872e96dcb2d9bf989b4ef7b0c0cb304c197297dc2"}
Apr 23 17:41:32.100863 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.100821 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k6gbh" podStartSLOduration=3.095256317 podStartE2EDuration="23.100808506s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.488730556 +0000 UTC m=+3.188331413" lastFinishedPulling="2026-04-23 17:41:31.494282751 +0000 UTC m=+23.193883602" observedRunningTime="2026-04-23 17:41:32.100705098 +0000 UTC m=+23.800305970" watchObservedRunningTime="2026-04-23 17:41:32.100808506 +0000 UTC m=+23.800409377"
Apr 23 17:41:32.639329 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.639292 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:32.640141 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.640120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:32.902605 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:32.902515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:32.902762 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:32.902659 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:33.086903 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:33.086879 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-d8twf"
Apr 23 17:41:33.902210 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:33.902176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:33.902385 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:33.902299 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:34.902285 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:34.902105 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:34.902747 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:34.902355 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:35.091479 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.091451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 17:41:35.091765 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.091735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"cf3cf1c7d7c0adf7a644b3e0cc7748c18b26e4c7ab84fccea00057dfb5d1dac4"}
Apr 23 17:41:35.092049 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.092030 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:35.092164 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.092059 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:35.092164 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.092073 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:35.092264 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.092240 2579 scope.go:117] "RemoveContainer" containerID="f5cb4a780029325f849412fb53d9f4277d21d9e88258dc7f30ec389431aeb19c"
Apr 23 17:41:35.094367 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.094344 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b02fbde-34d7-498c-ba52-33a9307442e3" containerID="0b8e41114c9f6204af569243613f8e61f3c012d5cde386f152df5bf9b602eef8" exitCode=0
Apr 23 17:41:35.094466 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.094435 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerDied","Data":"0b8e41114c9f6204af569243613f8e61f3c012d5cde386f152df5bf9b602eef8"}
Apr 23 17:41:35.108788 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.108767 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:35.112407 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.112389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2"
Apr 23 17:41:35.902157 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:35.901998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:35.902285 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:35.902225 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:36.025792 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.025762 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h7vln"]
Apr 23 17:41:36.026214 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.025873 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:36.026214 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:36.025974 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:36.027882 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.027862 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-45qq7"]
Apr 23 17:41:36.099706 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.099681 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 17:41:36.100054 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.100023 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" event={"ID":"b82ed798-7ebc-40ee-8b35-03a742ad4e5e","Type":"ContainerStarted","Data":"88f52560615a47149c506de986eba5e4d6d091cc56ac13496efc416cb5b34042"}
Apr 23 17:41:36.101905 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.101885 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b02fbde-34d7-498c-ba52-33a9307442e3" containerID="1bd8f3a76d669453a297125fdac9ab65e7ee56366cb3a7a5aef57dfa788dcdb3" exitCode=0
Apr 23 17:41:36.102011 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.101971 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerDied","Data":"1bd8f3a76d669453a297125fdac9ab65e7ee56366cb3a7a5aef57dfa788dcdb3"}
Apr 23 17:41:36.102088 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.102073 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:36.102193 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:36.102176 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:36.120890 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:36.120849 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" podStartSLOduration=9.731441588 podStartE2EDuration="27.120834538s" podCreationTimestamp="2026-04-23 17:41:09 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.491987788 +0000 UTC m=+3.191588654" lastFinishedPulling="2026-04-23 17:41:28.881380741 +0000 UTC m=+20.580981604" observedRunningTime="2026-04-23 17:41:36.119447417 +0000 UTC m=+27.819048289" watchObservedRunningTime="2026-04-23 17:41:36.120834538 +0000 UTC m=+27.820435410"
Apr 23 17:41:37.106554 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:37.106465 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b02fbde-34d7-498c-ba52-33a9307442e3" containerID="e031e50f0833baa224737e274043b3e64f9befa430f5161b44efb95fa2ee181a" exitCode=0
Apr 23 17:41:37.106554 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:37.106539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerDied","Data":"e031e50f0833baa224737e274043b3e64f9befa430f5161b44efb95fa2ee181a"}
Apr 23 17:41:37.902548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:37.902502 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:37.902711 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:37.902516 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:37.902711 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:37.902664 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:37.902785 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:37.902734 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:39.902098 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:39.902067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:39.902768 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:39.902066 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:39.902768 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:39.902217 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352"
Apr 23 17:41:39.902768 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:39.902289 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h7vln" podUID="7f6fcfee-1046-4e89-a5c4-e3d550d4056f"
Apr 23 17:41:41.598070 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.598046 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-107.ec2.internal" event="NodeReady"
Apr 23 17:41:41.598479 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.598189 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 17:41:41.657224 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.657190 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7trf4"]
Apr 23 17:41:41.661695 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.661669 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.663961 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.663704 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 17:41:41.663961 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.663790 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b6vkm\""
Apr 23 17:41:41.664142 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.664045 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 17:41:41.664389 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.664370 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g4zr9"]
Apr 23 17:41:41.667418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.667164 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4zr9"
Apr 23 17:41:41.670098 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.670059 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 17:41:41.670581 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.670328 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bmxrf\""
Apr 23 17:41:41.670581 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.670350 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 17:41:41.670714 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.670641 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7trf4"]
Apr 23 17:41:41.671143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.671115 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 17:41:41.681059 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.681036 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g4zr9"]
Apr 23 17:41:41.821767 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.821686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4rj\" (UniqueName: \"kubernetes.io/projected/abe92563-1ac2-4f25-b349-ffe1fce0022f-kube-api-access-rb4rj\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9"
Apr 23 17:41:41.821767 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.821739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9288ab5-ccb4-416b-aa52-180278252652-tmp-dir\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.821999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.821786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkvr\" (UniqueName: \"kubernetes.io/projected/d9288ab5-ccb4-416b-aa52-180278252652-kube-api-access-fkkvr\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.821999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.821808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9288ab5-ccb4-416b-aa52-180278252652-config-volume\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.821999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.821886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9"
Apr 23 17:41:41.821999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.821910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.902336 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.902299 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln"
Apr 23 17:41:41.902516 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.902417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:41:41.905170 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.905032 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:41:41.905170 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.905046 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:41:41.905170 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.905033 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:41:41.905410 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.905378 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kjp2q\""
Apr 23 17:41:41.905458 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.905377 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qk5s8\""
Apr 23 17:41:41.922774 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.922752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.922911 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.922794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4rj\" (UniqueName: \"kubernetes.io/projected/abe92563-1ac2-4f25-b349-ffe1fce0022f-kube-api-access-rb4rj\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9"
Apr 23 17:41:41.922911 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.922815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9288ab5-ccb4-416b-aa52-180278252652-tmp-dir\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.923054 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:41.922907 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:41:41.923054 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.922984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkvr\" (UniqueName: \"kubernetes.io/projected/d9288ab5-ccb4-416b-aa52-180278252652-kube-api-access-fkkvr\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.923054 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:41.922998 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:42.422975867 +0000 UTC m=+34.122576730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found
Apr 23 17:41:41.923054 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.923036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9288ab5-ccb4-416b-aa52-180278252652-config-volume\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.923253 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.923076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9"
Apr 23 17:41:41.923253 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.923106 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9288ab5-ccb4-416b-aa52-180278252652-tmp-dir\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4"
Apr 23 17:41:41.923253 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:41.923174 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:41:41.923253 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:41.923211 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:42.423198925 +0000 UTC m=+34.122799780 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:41:41.923588 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.923555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9288ab5-ccb4-416b-aa52-180278252652-config-volume\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:41.933928 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.933748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkvr\" (UniqueName: \"kubernetes.io/projected/d9288ab5-ccb4-416b-aa52-180278252652-kube-api-access-fkkvr\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:41.934133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:41.933779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4rj\" (UniqueName: \"kubernetes.io/projected/abe92563-1ac2-4f25-b349-ffe1fce0022f-kube-api-access-rb4rj\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:41:42.427924 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:42.427880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:42.428130 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:42.428028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:41:42.428130 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:42.428051 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:42.428249 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:42.428132 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:43.428111449 +0000 UTC m=+35.127712303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found Apr 23 17:41:42.428249 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:42.428140 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:42.428249 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:42.428203 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:43.428188408 +0000 UTC m=+35.127789260 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:41:42.529222 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:42.529182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:41:42.529407 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:42.529297 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:41:42.529407 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:42.529361 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:14.529345061 +0000 UTC m=+66.228945913 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : secret "metrics-daemon-secret" not found Apr 23 17:41:42.629755 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:42.629715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:42.633154 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:42.633126 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trz4v\" (UniqueName: \"kubernetes.io/projected/7f6fcfee-1046-4e89-a5c4-e3d550d4056f-kube-api-access-trz4v\") pod \"network-check-target-h7vln\" (UID: \"7f6fcfee-1046-4e89-a5c4-e3d550d4056f\") " pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:42.819462 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:42.819416 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:43.077749 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:43.077722 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h7vln"] Apr 23 17:41:43.169427 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:41:43.169382 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6fcfee_1046_4e89_a5c4_e3d550d4056f.slice/crio-26efb7d9ba25a3abb5755b210993f927ec47c8b2b8d9235f4fc731bdc627205b WatchSource:0}: Error finding container 26efb7d9ba25a3abb5755b210993f927ec47c8b2b8d9235f4fc731bdc627205b: Status 404 returned error can't find the container with id 26efb7d9ba25a3abb5755b210993f927ec47c8b2b8d9235f4fc731bdc627205b Apr 23 17:41:43.436784 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:43.436749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:43.436956 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:43.436806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:41:43.436956 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:43.436894 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:43.436956 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:43.436897 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 23 17:41:43.437078 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:43.436963 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:45.436930325 +0000 UTC m=+37.136531175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:41:43.437078 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:43.436977 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:45.43697155 +0000 UTC m=+37.136572400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found Apr 23 17:41:44.125087 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:44.125043 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b02fbde-34d7-498c-ba52-33a9307442e3" containerID="ac40cfb4b6d485f7e650a3c2c897f5f2f5f778a137f7b40477e646fdf656dc3d" exitCode=0 Apr 23 17:41:44.125537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:44.125128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerDied","Data":"ac40cfb4b6d485f7e650a3c2c897f5f2f5f778a137f7b40477e646fdf656dc3d"} Apr 23 17:41:44.126511 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:44.126488 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h7vln" event={"ID":"7f6fcfee-1046-4e89-a5c4-e3d550d4056f","Type":"ContainerStarted","Data":"26efb7d9ba25a3abb5755b210993f927ec47c8b2b8d9235f4fc731bdc627205b"} Apr 23 17:41:45.132108 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:45.131860 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b02fbde-34d7-498c-ba52-33a9307442e3" containerID="c269f31bb0885d7fdb48261db96bb3ae28533d2c2d9bf525b111d110b2bf21bb" exitCode=0 Apr 23 17:41:45.132535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:45.131952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerDied","Data":"c269f31bb0885d7fdb48261db96bb3ae28533d2c2d9bf525b111d110b2bf21bb"} Apr 23 17:41:45.452982 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:45.452868 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:41:45.452982 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:45.452928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:45.453190 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:45.453036 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:45.453190 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:45.453119 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:49.453095294 +0000 UTC m=+41.152696167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found Apr 23 17:41:45.453190 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:45.453039 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:45.453346 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:45.453202 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:49.453182652 +0000 UTC m=+41.152783515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:41:46.136706 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:46.136675 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qnskt" event={"ID":"5b02fbde-34d7-498c-ba52-33a9307442e3","Type":"ContainerStarted","Data":"51b25a3dcbf76cacb3ebfc81f513b5f04ab003b66671d153f5cebf0e29f7b74b"} Apr 23 17:41:46.157178 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:46.157125 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qnskt" podStartSLOduration=6.451121816 podStartE2EDuration="38.157110421s" podCreationTimestamp="2026-04-23 17:41:08 +0000 UTC" firstStartedPulling="2026-04-23 17:41:11.491784095 +0000 UTC m=+3.191384949" lastFinishedPulling="2026-04-23 17:41:43.197772703 +0000 UTC m=+34.897373554" observedRunningTime="2026-04-23 17:41:46.155344282 +0000 UTC m=+37.854945156" watchObservedRunningTime="2026-04-23 17:41:46.157110421 +0000 UTC m=+37.856711283" Apr 23 17:41:47.139690 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:47.139653 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h7vln" event={"ID":"7f6fcfee-1046-4e89-a5c4-e3d550d4056f","Type":"ContainerStarted","Data":"eaedc244bc5168a9cc1d8b7be9d42e7e2ff8c8ffc18193f61aa267577ee6bb25"} Apr 23 17:41:47.140185 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:47.139858 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:41:47.155864 ip-10-0-131-107 kubenswrapper[2579]: I0423 
17:41:47.155815 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-h7vln" podStartSLOduration=36.237699442 podStartE2EDuration="39.155804366s" podCreationTimestamp="2026-04-23 17:41:08 +0000 UTC" firstStartedPulling="2026-04-23 17:41:43.174550738 +0000 UTC m=+34.874151602" lastFinishedPulling="2026-04-23 17:41:46.092655649 +0000 UTC m=+37.792256526" observedRunningTime="2026-04-23 17:41:47.154717934 +0000 UTC m=+38.854318831" watchObservedRunningTime="2026-04-23 17:41:47.155804366 +0000 UTC m=+38.855405237" Apr 23 17:41:49.484639 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:49.484595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:41:49.484639 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:49.484640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:49.485207 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:49.484729 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:49.485207 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:49.484740 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:41:49.485207 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:49.484782 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls 
podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:57.484767323 +0000 UTC m=+49.184368172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found Apr 23 17:41:49.485207 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:49.484806 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:41:57.484786725 +0000 UTC m=+49.184387579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:41:57.538628 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:57.538593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:41:57.539163 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:41:57.538660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:41:57.539163 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:57.538737 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 23 17:41:57.539163 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:57.538739 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:41:57.539163 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:57.538790 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:42:13.538776443 +0000 UTC m=+65.238377293 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:41:57.539163 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:41:57.538807 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:13.538794234 +0000 UTC m=+65.238395084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found Apr 23 17:42:07.117804 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:07.117771 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zxwk2" Apr 23 17:42:13.544181 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:13.544136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:42:13.544181 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:13.544187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:42:13.544679 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:13.544291 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:13.544679 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:13.544296 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:13.544679 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:13.544371 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:45.544353165 +0000 UTC m=+97.243954016 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found Apr 23 17:42:13.544679 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:13.544388 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:42:45.544381058 +0000 UTC m=+97.243981915 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found Apr 23 17:42:14.550418 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:14.550371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:42:14.550825 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:14.550512 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:42:14.550825 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:14.550582 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:43:18.550563242 +0000 UTC m=+130.250164091 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : secret "metrics-daemon-secret" not found Apr 23 17:42:18.144222 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:18.144089 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-h7vln" Apr 23 17:42:45.554044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:45.553994 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:42:45.554044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:42:45.554049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:42:45.554613 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:45.554147 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:45.554613 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:45.554149 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:45.554613 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:45.554208 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls 
podName:d9288ab5-ccb4-416b-aa52-180278252652 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:49.554195547 +0000 UTC m=+161.253796397 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls") pod "dns-default-7trf4" (UID: "d9288ab5-ccb4-416b-aa52-180278252652") : secret "dns-default-metrics-tls" not found
Apr 23 17:42:45.554613 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:42:45.554222 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert podName:abe92563-1ac2-4f25-b349-ffe1fce0022f nodeName:}" failed. No retries permitted until 2026-04-23 17:43:49.554216128 +0000 UTC m=+161.253816977 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert") pod "ingress-canary-g4zr9" (UID: "abe92563-1ac2-4f25-b349-ffe1fce0022f") : secret "canary-serving-cert" not found
Apr 23 17:43:10.029306 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.029269 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"]
Apr 23 17:43:10.032302 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.032279 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.034629 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.034606 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 17:43:10.034742 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.034610 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 17:43:10.035209 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.035194 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 17:43:10.035292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.035209 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-c2w6f\""
Apr 23 17:43:10.035292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.035226 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:43:10.039885 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.039863 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"]
Apr 23 17:43:10.114118 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.114081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745573c0-66a3-4265-893e-2f9ebdaa8f3c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.114402 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.114380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745573c0-66a3-4265-893e-2f9ebdaa8f3c-config\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.114536 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.114520 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77ns2\" (UniqueName: \"kubernetes.io/projected/745573c0-66a3-4265-893e-2f9ebdaa8f3c-kube-api-access-77ns2\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.140473 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.140443 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"]
Apr 23 17:43:10.143249 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.143234 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dsmq2"]
Apr 23 17:43:10.143398 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.143379 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.145741 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.145724 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.149211 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.149188 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bcfcf8949-44j22"]
Apr 23 17:43:10.149443 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.149412 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 17:43:10.149891 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.149717 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vdctg\""
Apr 23 17:43:10.150168 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.150149 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 17:43:10.152133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.152116 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.152561 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.152538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bpk5j\""
Apr 23 17:43:10.152636 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.152586 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 17:43:10.152685 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.152588 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 17:43:10.152860 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.152846 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 17:43:10.152925 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.152851 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 17:43:10.153092 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.153074 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 17:43:10.154191 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.154177 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 17:43:10.159174 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.159146 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7hx6w\""
Apr 23 17:43:10.159278 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.159203 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 17:43:10.159278 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.159159 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 17:43:10.159607 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.159585 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 17:43:10.164708 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.164691 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 17:43:10.167039 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.167023 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 17:43:10.174210 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.174173 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"]
Apr 23 17:43:10.175157 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.175137 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dsmq2"]
Apr 23 17:43:10.182493 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.182473 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bcfcf8949-44j22"]
Apr 23 17:43:10.215509 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745573c0-66a3-4265-893e-2f9ebdaa8f3c-config\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.215509 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77ns2\" (UniqueName: \"kubernetes.io/projected/745573c0-66a3-4265-893e-2f9ebdaa8f3c-kube-api-access-77ns2\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.215700 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-installation-pull-secrets\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.215700 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc12dc89-ceae-445a-92dc-0ec601992482-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.215700 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc12dc89-ceae-445a-92dc-0ec601992482-serving-cert\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.215856 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc12dc89-ceae-445a-92dc-0ec601992482-tmp\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.215856 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-image-registry-private-configuration\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.215856 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-certificates\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.215856 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbzrn\" (UniqueName: \"kubernetes.io/projected/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-kube-api-access-vbzrn\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.215856 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc12dc89-ceae-445a-92dc-0ec601992482-service-ca-bundle\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-bound-sa-token\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745573c0-66a3-4265-893e-2f9ebdaa8f3c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.215986 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.216001 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebd07caf-40e6-449a-9045-8ed993d63a23-ca-trust-extracted\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.216017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-trusted-ca\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.216040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc12dc89-ceae-445a-92dc-0ec601992482-snapshots\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.216059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsms\" (UniqueName: \"kubernetes.io/projected/fc12dc89-ceae-445a-92dc-0ec601992482-kube-api-access-6vsms\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.216133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.216086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8rk\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-kube-api-access-zv8rk\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.216431 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.216137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745573c0-66a3-4265-893e-2f9ebdaa8f3c-config\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.219652 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.219631 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745573c0-66a3-4265-893e-2f9ebdaa8f3c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.241386 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.241358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77ns2\" (UniqueName: \"kubernetes.io/projected/745573c0-66a3-4265-893e-2f9ebdaa8f3c-kube-api-access-77ns2\") pod \"service-ca-operator-d6fc45fc5-659cb\" (UID: \"745573c0-66a3-4265-893e-2f9ebdaa8f3c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.317246 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc12dc89-ceae-445a-92dc-0ec601992482-tmp\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.317246 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-image-registry-private-configuration\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.317246 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317218 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-certificates\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.317246 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbzrn\" (UniqueName: \"kubernetes.io/projected/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-kube-api-access-vbzrn\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.317548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc12dc89-ceae-445a-92dc-0ec601992482-service-ca-bundle\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.317548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.317548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-bound-sa-token\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.317548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.317740 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.317552 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:43:10.317740 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.317572 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcfcf8949-44j22: secret "image-registry-tls" not found
Apr 23 17:43:10.317740 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.317643 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls podName:ebd07caf-40e6-449a-9045-8ed993d63a23 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:10.817620473 +0000 UTC m=+122.517221325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls") pod "image-registry-7bcfcf8949-44j22" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23") : secret "image-registry-tls" not found
Apr 23 17:43:10.317740 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc12dc89-ceae-445a-92dc-0ec601992482-tmp\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.317740 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebd07caf-40e6-449a-9045-8ed993d63a23-ca-trust-extracted\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.317822 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-certificates\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.317869 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls podName:a3ef0c5c-c346-4ab2-8f0b-127c98996cb5 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:10.817854864 +0000 UTC m=+122.517455727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8hmmb" (UID: "a3ef0c5c-c346-4ab2-8f0b-127c98996cb5") : secret "cluster-monitoring-operator-tls" not found
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317877 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc12dc89-ceae-445a-92dc-0ec601992482-service-ca-bundle\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-trusted-ca\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc12dc89-ceae-445a-92dc-0ec601992482-snapshots\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsms\" (UniqueName: \"kubernetes.io/projected/fc12dc89-ceae-445a-92dc-0ec601992482-kube-api-access-6vsms\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.317998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv8rk\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-kube-api-access-zv8rk\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-installation-pull-secrets\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc12dc89-ceae-445a-92dc-0ec601992482-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.318143 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc12dc89-ceae-445a-92dc-0ec601992482-serving-cert\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.318791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebd07caf-40e6-449a-9045-8ed993d63a23-ca-trust-extracted\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.318791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.318791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318467 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc12dc89-ceae-445a-92dc-0ec601992482-snapshots\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.318791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.318723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-trusted-ca\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.319695 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.319670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc12dc89-ceae-445a-92dc-0ec601992482-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.320306 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.320289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-image-registry-private-configuration\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.320734 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.320713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-installation-pull-secrets\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.320805 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.320787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc12dc89-ceae-445a-92dc-0ec601992482-serving-cert\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.333602 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.333571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbzrn\" (UniqueName: \"kubernetes.io/projected/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-kube-api-access-vbzrn\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"
Apr 23 17:43:10.333740 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.333653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-bound-sa-token\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.333740 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.333703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv8rk\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-kube-api-access-zv8rk\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:43:10.334250 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.334232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsms\" (UniqueName: \"kubernetes.io/projected/fc12dc89-ceae-445a-92dc-0ec601992482-kube-api-access-6vsms\") pod \"insights-operator-585dfdc468-dsmq2\" (UID: \"fc12dc89-ceae-445a-92dc-0ec601992482\") " pod="openshift-insights/insights-operator-585dfdc468-dsmq2"
Apr 23 17:43:10.341096 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.341079 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"
Apr 23 17:43:10.358497 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.358470 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-d79k5"]
Apr 23 17:43:10.362831 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.362812 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:43:10.367247 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.367218 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 17:43:10.367368 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.367256 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:43:10.367368 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.367342 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 17:43:10.367479 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.367228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ffntw\""
Apr 23 17:43:10.368122 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.368019 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 17:43:10.372371 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.372353 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 17:43:10.376904 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.376884 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-d79k5"]
Apr 23 17:43:10.419094 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.419011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-config\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:43:10.419094 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.419071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-trusted-ca\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:43:10.419276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.419191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmj9\" (UniqueName: \"kubernetes.io/projected/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-kube-api-access-vzmj9\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:43:10.419276 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.419250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-serving-cert\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:43:10.459506 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.459473 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dsmq2" Apr 23 17:43:10.460825 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.460801 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb"] Apr 23 17:43:10.464509 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:10.464486 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod745573c0_66a3_4265_893e_2f9ebdaa8f3c.slice/crio-db502cae0a7dd1c36857bdc71d9218e29428d39079011a2226747696419689fb WatchSource:0}: Error finding container db502cae0a7dd1c36857bdc71d9218e29428d39079011a2226747696419689fb: Status 404 returned error can't find the container with id db502cae0a7dd1c36857bdc71d9218e29428d39079011a2226747696419689fb Apr 23 17:43:10.520386 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.520352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-config\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.520507 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.520434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-trusted-ca\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.520507 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.520486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmj9\" (UniqueName: \"kubernetes.io/projected/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-kube-api-access-vzmj9\") pod 
\"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.520626 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.520533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-serving-cert\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.521161 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.521102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-config\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.521384 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.521364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-trusted-ca\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.522828 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.522810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-serving-cert\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.529117 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.529095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vzmj9\" (UniqueName: \"kubernetes.io/projected/0e6eb10c-aea8-4862-8cc7-ecd18b5f6498-kube-api-access-vzmj9\") pod \"console-operator-9d4b6777b-d79k5\" (UID: \"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498\") " pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.581054 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.580978 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dsmq2"] Apr 23 17:43:10.583757 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:10.583726 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc12dc89_ceae_445a_92dc_0ec601992482.slice/crio-afcbaf12b71ab1fa6bc6f40beffb79044c4c8d4e4debb9bf1ab7a69a45846031 WatchSource:0}: Error finding container afcbaf12b71ab1fa6bc6f40beffb79044c4c8d4e4debb9bf1ab7a69a45846031: Status 404 returned error can't find the container with id afcbaf12b71ab1fa6bc6f40beffb79044c4c8d4e4debb9bf1ab7a69a45846031 Apr 23 17:43:10.678304 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.678267 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:10.792693 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.792661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-d79k5"] Apr 23 17:43:10.797162 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:10.797133 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e6eb10c_aea8_4862_8cc7_ecd18b5f6498.slice/crio-aa5ab680a26a69426712e1bada14de4e0fe519d245eaa68f58b115e397bec668 WatchSource:0}: Error finding container aa5ab680a26a69426712e1bada14de4e0fe519d245eaa68f58b115e397bec668: Status 404 returned error can't find the container with id aa5ab680a26a69426712e1bada14de4e0fe519d245eaa68f58b115e397bec668 Apr 23 17:43:10.822553 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.822523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:10.822656 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:10.822588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:10.822693 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.822671 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 
17:43:10.822693 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.822687 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:43:10.822755 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.822697 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcfcf8949-44j22: secret "image-registry-tls" not found Apr 23 17:43:10.822755 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.822736 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls podName:a3ef0c5c-c346-4ab2-8f0b-127c98996cb5 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:11.822720127 +0000 UTC m=+123.522320977 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8hmmb" (UID: "a3ef0c5c-c346-4ab2-8f0b-127c98996cb5") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:10.822755 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:10.822752 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls podName:ebd07caf-40e6-449a-9045-8ed993d63a23 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:11.822743971 +0000 UTC m=+123.522344820 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls") pod "image-registry-7bcfcf8949-44j22" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23") : secret "image-registry-tls" not found Apr 23 17:43:11.298396 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:11.298313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dsmq2" event={"ID":"fc12dc89-ceae-445a-92dc-0ec601992482","Type":"ContainerStarted","Data":"afcbaf12b71ab1fa6bc6f40beffb79044c4c8d4e4debb9bf1ab7a69a45846031"} Apr 23 17:43:11.300045 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:11.299983 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb" event={"ID":"745573c0-66a3-4265-893e-2f9ebdaa8f3c","Type":"ContainerStarted","Data":"db502cae0a7dd1c36857bdc71d9218e29428d39079011a2226747696419689fb"} Apr 23 17:43:11.301688 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:11.301657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" event={"ID":"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498","Type":"ContainerStarted","Data":"aa5ab680a26a69426712e1bada14de4e0fe519d245eaa68f58b115e397bec668"} Apr 23 17:43:11.831580 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:11.831539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:11.831818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:11.831643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:11.831818 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:11.831758 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:11.831818 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:11.831798 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:43:11.831818 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:11.831815 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcfcf8949-44j22: secret "image-registry-tls" not found Apr 23 17:43:11.832055 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:11.831841 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls podName:a3ef0c5c-c346-4ab2-8f0b-127c98996cb5 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:13.83181977 +0000 UTC m=+125.531420623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8hmmb" (UID: "a3ef0c5c-c346-4ab2-8f0b-127c98996cb5") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:11.832055 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:11.831867 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls podName:ebd07caf-40e6-449a-9045-8ed993d63a23 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:43:13.831849111 +0000 UTC m=+125.531449970 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls") pod "image-registry-7bcfcf8949-44j22" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23") : secret "image-registry-tls" not found Apr 23 17:43:13.845807 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.845768 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m"] Apr 23 17:43:13.851036 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.850998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:13.851178 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.851102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:13.851252 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:13.851186 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:13.851252 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:13.851227 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:43:13.851252 ip-10-0-131-107 kubenswrapper[2579]: E0423 
17:43:13.851240 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcfcf8949-44j22: secret "image-registry-tls" not found Apr 23 17:43:13.851382 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:13.851275 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls podName:a3ef0c5c-c346-4ab2-8f0b-127c98996cb5 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:17.85125477 +0000 UTC m=+129.550855626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8hmmb" (UID: "a3ef0c5c-c346-4ab2-8f0b-127c98996cb5") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:13.851382 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:13.851299 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls podName:ebd07caf-40e6-449a-9045-8ed993d63a23 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:17.851283555 +0000 UTC m=+129.550884411 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls") pod "image-registry-7bcfcf8949-44j22" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23") : secret "image-registry-tls" not found Apr 23 17:43:13.851548 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.851531 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" Apr 23 17:43:13.853905 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.853886 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 17:43:13.854025 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.853906 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 17:43:13.854433 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.854418 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5wwlv\"" Apr 23 17:43:13.858016 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.857989 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m"] Apr 23 17:43:13.951911 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:13.951873 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkmj\" (UniqueName: \"kubernetes.io/projected/1cdd9b61-36a1-469f-97dc-02d5fadd2316-kube-api-access-qxkmj\") pod \"migrator-74bb7799d9-4h56m\" (UID: \"1cdd9b61-36a1-469f-97dc-02d5fadd2316\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" Apr 23 17:43:14.052860 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.052816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkmj\" (UniqueName: \"kubernetes.io/projected/1cdd9b61-36a1-469f-97dc-02d5fadd2316-kube-api-access-qxkmj\") pod \"migrator-74bb7799d9-4h56m\" (UID: \"1cdd9b61-36a1-469f-97dc-02d5fadd2316\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" Apr 23 17:43:14.062342 ip-10-0-131-107 
kubenswrapper[2579]: I0423 17:43:14.062306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkmj\" (UniqueName: \"kubernetes.io/projected/1cdd9b61-36a1-469f-97dc-02d5fadd2316-kube-api-access-qxkmj\") pod \"migrator-74bb7799d9-4h56m\" (UID: \"1cdd9b61-36a1-469f-97dc-02d5fadd2316\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" Apr 23 17:43:14.161851 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.161761 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" Apr 23 17:43:14.289367 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.289334 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m"] Apr 23 17:43:14.292451 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:14.292421 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cdd9b61_36a1_469f_97dc_02d5fadd2316.slice/crio-87887620f664640eabc8976853c59717f6dfda991fe47c380899e182dec207f4 WatchSource:0}: Error finding container 87887620f664640eabc8976853c59717f6dfda991fe47c380899e182dec207f4: Status 404 returned error can't find the container with id 87887620f664640eabc8976853c59717f6dfda991fe47c380899e182dec207f4 Apr 23 17:43:14.309189 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.309152 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dsmq2" event={"ID":"fc12dc89-ceae-445a-92dc-0ec601992482","Type":"ContainerStarted","Data":"8bafc571aa8c60bf93f5f70ca1691b929d04f59c7611b0967a89d85316ab3645"} Apr 23 17:43:14.310555 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.310530 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb" 
event={"ID":"745573c0-66a3-4265-893e-2f9ebdaa8f3c","Type":"ContainerStarted","Data":"20776d87ad35e1ddaae619ab589e1c4bb22279bba4a102442a586b7b5b3ec95b"} Apr 23 17:43:14.311560 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.311542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" event={"ID":"1cdd9b61-36a1-469f-97dc-02d5fadd2316","Type":"ContainerStarted","Data":"87887620f664640eabc8976853c59717f6dfda991fe47c380899e182dec207f4"} Apr 23 17:43:14.313045 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.313028 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/0.log" Apr 23 17:43:14.313124 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.313067 2579 generic.go:358] "Generic (PLEG): container finished" podID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" containerID="0e38f8eec87cfb24af7cf448fc1429b73bfe1fd65f2ef3c07382be303f8e6de0" exitCode=255 Apr 23 17:43:14.313124 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.313115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" event={"ID":"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498","Type":"ContainerDied","Data":"0e38f8eec87cfb24af7cf448fc1429b73bfe1fd65f2ef3c07382be303f8e6de0"} Apr 23 17:43:14.313329 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.313314 2579 scope.go:117] "RemoveContainer" containerID="0e38f8eec87cfb24af7cf448fc1429b73bfe1fd65f2ef3c07382be303f8e6de0" Apr 23 17:43:14.336481 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.336441 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-dsmq2" podStartSLOduration=1.631959639 podStartE2EDuration="4.336427183s" podCreationTimestamp="2026-04-23 17:43:10 +0000 UTC" firstStartedPulling="2026-04-23 17:43:10.585507907 +0000 
UTC m=+122.285108758" lastFinishedPulling="2026-04-23 17:43:13.289975446 +0000 UTC m=+124.989576302" observedRunningTime="2026-04-23 17:43:14.336220675 +0000 UTC m=+126.035821547" watchObservedRunningTime="2026-04-23 17:43:14.336427183 +0000 UTC m=+126.036028054" Apr 23 17:43:14.360235 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:14.360182 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb" podStartSLOduration=1.538854699 podStartE2EDuration="4.360164089s" podCreationTimestamp="2026-04-23 17:43:10 +0000 UTC" firstStartedPulling="2026-04-23 17:43:10.466476634 +0000 UTC m=+122.166077484" lastFinishedPulling="2026-04-23 17:43:13.287786024 +0000 UTC m=+124.987386874" observedRunningTime="2026-04-23 17:43:14.357858877 +0000 UTC m=+126.057459750" watchObservedRunningTime="2026-04-23 17:43:14.360164089 +0000 UTC m=+126.059764959" Apr 23 17:43:15.316514 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.316488 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/1.log" Apr 23 17:43:15.316914 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.316853 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/0.log" Apr 23 17:43:15.316914 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.316886 2579 generic.go:358] "Generic (PLEG): container finished" podID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" containerID="fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00" exitCode=255 Apr 23 17:43:15.317034 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.316991 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" 
event={"ID":"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498","Type":"ContainerDied","Data":"fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00"} Apr 23 17:43:15.317083 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.317040 2579 scope.go:117] "RemoveContainer" containerID="0e38f8eec87cfb24af7cf448fc1429b73bfe1fd65f2ef3c07382be303f8e6de0" Apr 23 17:43:15.317226 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.317192 2579 scope.go:117] "RemoveContainer" containerID="fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00" Apr 23 17:43:15.317408 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:15.317390 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-d79k5_openshift-console-operator(0e6eb10c-aea8-4862-8cc7-ecd18b5f6498)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podUID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" Apr 23 17:43:15.318720 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.318693 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" event={"ID":"1cdd9b61-36a1-469f-97dc-02d5fadd2316","Type":"ContainerStarted","Data":"c6b701bbfdd242edb4242c5983e9a4a8752e0273be3c168371f45ee24eba217f"} Apr 23 17:43:15.318814 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.318729 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" event={"ID":"1cdd9b61-36a1-469f-97dc-02d5fadd2316","Type":"ContainerStarted","Data":"430c1d22f9999b33f4ddfe4ad948fc7b9f1aceee8743b6630ca02098ffea7861"} Apr 23 17:43:15.368263 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:15.368216 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4h56m" 
podStartSLOduration=1.491692633 podStartE2EDuration="2.368200573s" podCreationTimestamp="2026-04-23 17:43:13 +0000 UTC" firstStartedPulling="2026-04-23 17:43:14.294364633 +0000 UTC m=+125.993965486" lastFinishedPulling="2026-04-23 17:43:15.170872574 +0000 UTC m=+126.870473426" observedRunningTime="2026-04-23 17:43:15.3642154 +0000 UTC m=+127.063816341" watchObservedRunningTime="2026-04-23 17:43:15.368200573 +0000 UTC m=+127.067801445" Apr 23 17:43:16.322786 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:16.322754 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/1.log" Apr 23 17:43:16.323184 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:16.323163 2579 scope.go:117] "RemoveContainer" containerID="fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00" Apr 23 17:43:16.323391 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:16.323370 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-d79k5_openshift-console-operator(0e6eb10c-aea8-4862-8cc7-ecd18b5f6498)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podUID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" Apr 23 17:43:17.544855 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:17.544825 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-thrhm_994fa59f-956e-4d6b-8074-e3d2459771d9/dns-node-resolver/0.log" Apr 23 17:43:17.891537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:17.891444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: 
\"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:17.891537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:17.891517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:17.891723 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:17.891596 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:17.891723 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:17.891599 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:43:17.891723 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:17.891618 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcfcf8949-44j22: secret "image-registry-tls" not found Apr 23 17:43:17.891723 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:17.891651 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls podName:a3ef0c5c-c346-4ab2-8f0b-127c98996cb5 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:25.891635747 +0000 UTC m=+137.591236596 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8hmmb" (UID: "a3ef0c5c-c346-4ab2-8f0b-127c98996cb5") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:17.891723 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:17.891665 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls podName:ebd07caf-40e6-449a-9045-8ed993d63a23 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:25.89165834 +0000 UTC m=+137.591259189 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls") pod "image-registry-7bcfcf8949-44j22" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23") : secret "image-registry-tls" not found Apr 23 17:43:18.344714 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:18.344685 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jldlx_672ee16e-ccf3-47b3-a727-a91e0e7a9fbc/node-ca/0.log" Apr 23 17:43:18.597536 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:18.597453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7" Apr 23 17:43:18.597881 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:18.597601 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:43:18.597881 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:18.597665 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs podName:51d0c740-3b6f-4927-90d0-03577afcf352 nodeName:}" failed. No retries permitted until 2026-04-23 17:45:20.597649817 +0000 UTC m=+252.297250667 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs") pod "network-metrics-daemon-45qq7" (UID: "51d0c740-3b6f-4927-90d0-03577afcf352") : secret "metrics-daemon-secret" not found Apr 23 17:43:19.554070 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:19.554039 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4h56m_1cdd9b61-36a1-469f-97dc-02d5fadd2316/migrator/0.log" Apr 23 17:43:19.753353 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:19.753320 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4h56m_1cdd9b61-36a1-469f-97dc-02d5fadd2316/graceful-termination/0.log" Apr 23 17:43:20.679086 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:20.679053 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:20.679086 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:20.679086 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:20.679438 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:20.679425 2579 scope.go:117] "RemoveContainer" containerID="fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00" Apr 23 17:43:20.679596 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:20.679580 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-9d4b6777b-d79k5_openshift-console-operator(0e6eb10c-aea8-4862-8cc7-ecd18b5f6498)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podUID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" Apr 23 17:43:25.954474 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:25.954429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:25.954913 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:25.954526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:25.954913 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:25.954659 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:25.955192 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:25.955163 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls podName:a3ef0c5c-c346-4ab2-8f0b-127c98996cb5 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:41.955137755 +0000 UTC m=+153.654738609 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8hmmb" (UID: "a3ef0c5c-c346-4ab2-8f0b-127c98996cb5") : secret "cluster-monitoring-operator-tls" not found Apr 23 17:43:25.959318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:25.959295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"image-registry-7bcfcf8949-44j22\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:26.066025 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:26.065990 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:26.204545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:26.204500 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bcfcf8949-44j22"] Apr 23 17:43:26.206405 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:26.206380 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebd07caf_40e6_449a_9045_8ed993d63a23.slice/crio-cfef84eab84149abc055c5be4d7e557843b5bc530b97e3e08a4793ee4a32d7ce WatchSource:0}: Error finding container cfef84eab84149abc055c5be4d7e557843b5bc530b97e3e08a4793ee4a32d7ce: Status 404 returned error can't find the container with id cfef84eab84149abc055c5be4d7e557843b5bc530b97e3e08a4793ee4a32d7ce Apr 23 17:43:26.347904 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:26.347866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" 
event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerStarted","Data":"89e21b9c8074c55c24b90fc2b7bf5d4c8a7407293af72de2a4381a0356aa8b9d"} Apr 23 17:43:26.348105 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:26.347912 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerStarted","Data":"cfef84eab84149abc055c5be4d7e557843b5bc530b97e3e08a4793ee4a32d7ce"} Apr 23 17:43:26.348105 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:26.347966 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:43:26.373132 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:26.373085 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podStartSLOduration=16.373071011 podStartE2EDuration="16.373071011s" podCreationTimestamp="2026-04-23 17:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:43:26.37241357 +0000 UTC m=+138.072014443" watchObservedRunningTime="2026-04-23 17:43:26.373071011 +0000 UTC m=+138.072671920" Apr 23 17:43:31.902306 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:31.902276 2579 scope.go:117] "RemoveContainer" containerID="fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00" Apr 23 17:43:32.363922 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:32.363898 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 17:43:32.364293 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:32.364276 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/1.log" Apr 23 17:43:32.364349 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:32.364315 2579 generic.go:358] "Generic (PLEG): container finished" podID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" containerID="b65b254a82ca855e79fb809f8414d3e7deb184cf5d423bd3e381e962ea138e71" exitCode=255 Apr 23 17:43:32.364385 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:32.364347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" event={"ID":"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498","Type":"ContainerDied","Data":"b65b254a82ca855e79fb809f8414d3e7deb184cf5d423bd3e381e962ea138e71"} Apr 23 17:43:32.364385 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:32.364376 2579 scope.go:117] "RemoveContainer" containerID="fd89143667331c231676c37a8afd5e22b27bf2f49ed5cae4d50660b6af20ce00" Apr 23 17:43:32.364720 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:32.364700 2579 scope.go:117] "RemoveContainer" containerID="b65b254a82ca855e79fb809f8414d3e7deb184cf5d423bd3e381e962ea138e71" Apr 23 17:43:32.364920 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:32.364894 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-d79k5_openshift-console-operator(0e6eb10c-aea8-4862-8cc7-ecd18b5f6498)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podUID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" Apr 23 17:43:33.368113 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:33.368083 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 17:43:40.678573 ip-10-0-131-107 kubenswrapper[2579]: I0423 
17:43:40.678537 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:40.678573 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:40.678576 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" Apr 23 17:43:40.678990 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:40.678967 2579 scope.go:117] "RemoveContainer" containerID="b65b254a82ca855e79fb809f8414d3e7deb184cf5d423bd3e381e962ea138e71" Apr 23 17:43:40.679187 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:40.679168 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-d79k5_openshift-console-operator(0e6eb10c-aea8-4862-8cc7-ecd18b5f6498)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podUID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" Apr 23 17:43:41.984890 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:41.984852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:41.987501 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:41.987477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3ef0c5c-c346-4ab2-8f0b-127c98996cb5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8hmmb\" (UID: \"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:42.252865 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:42.252778 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" Apr 23 17:43:42.378346 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:42.378310 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb"] Apr 23 17:43:42.381500 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:42.381470 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3ef0c5c_c346_4ab2_8f0b_127c98996cb5.slice/crio-c520c2f7328a722fd0f6e91abd320516db4304d3e392a4d33f62a4bcbe06f079 WatchSource:0}: Error finding container c520c2f7328a722fd0f6e91abd320516db4304d3e392a4d33f62a4bcbe06f079: Status 404 returned error can't find the container with id c520c2f7328a722fd0f6e91abd320516db4304d3e392a4d33f62a4bcbe06f079 Apr 23 17:43:42.389609 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:42.389583 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" event={"ID":"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5","Type":"ContainerStarted","Data":"c520c2f7328a722fd0f6e91abd320516db4304d3e392a4d33f62a4bcbe06f079"} Apr 23 17:43:44.395975 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:44.395922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" event={"ID":"a3ef0c5c-c346-4ab2-8f0b-127c98996cb5","Type":"ContainerStarted","Data":"93e31dcdf4769391cf87c918ce764d70fccfde7b8adb6a2fc2b57a96d3077caa"} Apr 23 17:43:44.413017 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:44.412970 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8hmmb" podStartSLOduration=32.928683792 podStartE2EDuration="34.412928358s" podCreationTimestamp="2026-04-23 17:43:10 +0000 UTC" firstStartedPulling="2026-04-23 17:43:42.383709404 +0000 UTC m=+154.083310254" lastFinishedPulling="2026-04-23 17:43:43.86795396 +0000 UTC m=+155.567554820" observedRunningTime="2026-04-23 17:43:44.41201906 +0000 UTC m=+156.111619927" watchObservedRunningTime="2026-04-23 17:43:44.412928358 +0000 UTC m=+156.112529230" Apr 23 17:43:44.674758 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:44.674658 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7trf4" podUID="d9288ab5-ccb4-416b-aa52-180278252652" Apr 23 17:43:44.679841 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:44.679813 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-g4zr9" podUID="abe92563-1ac2-4f25-b349-ffe1fce0022f" Apr 23 17:43:44.914324 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:44.914288 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-45qq7" podUID="51d0c740-3b6f-4927-90d0-03577afcf352" Apr 23 17:43:45.398031 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:45.397996 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:43:45.398466 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:45.398081 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7trf4" Apr 23 17:43:46.070305 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:46.070270 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:43:46.070495 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:46.070341 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:43:47.354924 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:47.354840 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:43:47.354924 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:47.354909 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:43:49.637985 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.637927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 
17:43:49.637985 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.637986 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:43:49.640511 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.640481 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9288ab5-ccb4-416b-aa52-180278252652-metrics-tls\") pod \"dns-default-7trf4\" (UID: \"d9288ab5-ccb4-416b-aa52-180278252652\") " pod="openshift-dns/dns-default-7trf4" Apr 23 17:43:49.640651 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.640526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe92563-1ac2-4f25-b349-ffe1fce0022f-cert\") pod \"ingress-canary-g4zr9\" (UID: \"abe92563-1ac2-4f25-b349-ffe1fce0022f\") " pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:43:49.902283 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.902199 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b6vkm\"" Apr 23 17:43:49.902283 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.902221 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bmxrf\"" Apr 23 17:43:49.910008 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.909980 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4zr9" Apr 23 17:43:49.910129 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:49.910009 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7trf4" Apr 23 17:43:50.040285 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:50.040206 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7trf4"] Apr 23 17:43:50.042987 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:50.042959 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9288ab5_ccb4_416b_aa52_180278252652.slice/crio-4494d47eca6b94fe4e7c4e71ec7ef3309d07c3657325a6421e9d8ed8b7861bd6 WatchSource:0}: Error finding container 4494d47eca6b94fe4e7c4e71ec7ef3309d07c3657325a6421e9d8ed8b7861bd6: Status 404 returned error can't find the container with id 4494d47eca6b94fe4e7c4e71ec7ef3309d07c3657325a6421e9d8ed8b7861bd6 Apr 23 17:43:50.059829 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:50.059798 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g4zr9"] Apr 23 17:43:50.062475 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:43:50.062449 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe92563_1ac2_4f25_b349_ffe1fce0022f.slice/crio-09335eb65e908335fbfd4ed5aa59cebed616597d2b6397d0d9430c15ec5134ad WatchSource:0}: Error finding container 09335eb65e908335fbfd4ed5aa59cebed616597d2b6397d0d9430c15ec5134ad: Status 404 returned error can't find the container with id 09335eb65e908335fbfd4ed5aa59cebed616597d2b6397d0d9430c15ec5134ad Apr 23 17:43:50.419040 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:50.418986 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g4zr9" event={"ID":"abe92563-1ac2-4f25-b349-ffe1fce0022f","Type":"ContainerStarted","Data":"09335eb65e908335fbfd4ed5aa59cebed616597d2b6397d0d9430c15ec5134ad"} Apr 23 17:43:50.420132 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:50.420096 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-7trf4" event={"ID":"d9288ab5-ccb4-416b-aa52-180278252652","Type":"ContainerStarted","Data":"4494d47eca6b94fe4e7c4e71ec7ef3309d07c3657325a6421e9d8ed8b7861bd6"} Apr 23 17:43:51.902895 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:51.902230 2579 scope.go:117] "RemoveContainer" containerID="b65b254a82ca855e79fb809f8414d3e7deb184cf5d423bd3e381e962ea138e71" Apr 23 17:43:51.902895 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:43:51.902439 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-d79k5_openshift-console-operator(0e6eb10c-aea8-4862-8cc7-ecd18b5f6498)\"" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podUID="0e6eb10c-aea8-4862-8cc7-ecd18b5f6498" Apr 23 17:43:52.427413 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:52.427375 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g4zr9" event={"ID":"abe92563-1ac2-4f25-b349-ffe1fce0022f","Type":"ContainerStarted","Data":"8a06845f5f46e39968f19ca8b49d474bd1aa3d16edbadd287febff9dae3dc45f"} Apr 23 17:43:52.428843 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:52.428821 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7trf4" event={"ID":"d9288ab5-ccb4-416b-aa52-180278252652","Type":"ContainerStarted","Data":"c34e69baf4d06ea5e91f9b878046c5b4812fb0fdc94575c00d7ffe149ce06203"} Apr 23 17:43:52.428969 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:52.428847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7trf4" event={"ID":"d9288ab5-ccb4-416b-aa52-180278252652","Type":"ContainerStarted","Data":"a3c894d33960057d52bc5a6408534f250ce9e04dd3e0039b0b07bd3c4fbf43ad"} Apr 23 17:43:52.429023 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:52.428981 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7trf4" Apr 23 17:43:52.444420 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:52.444365 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g4zr9" podStartSLOduration=129.737560151 podStartE2EDuration="2m11.444349273s" podCreationTimestamp="2026-04-23 17:41:41 +0000 UTC" firstStartedPulling="2026-04-23 17:43:50.064099146 +0000 UTC m=+161.763699995" lastFinishedPulling="2026-04-23 17:43:51.770888257 +0000 UTC m=+163.470489117" observedRunningTime="2026-04-23 17:43:52.443594893 +0000 UTC m=+164.143195766" watchObservedRunningTime="2026-04-23 17:43:52.444349273 +0000 UTC m=+164.143950144" Apr 23 17:43:52.461996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:52.461954 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7trf4" podStartSLOduration=129.739466305 podStartE2EDuration="2m11.461920094s" podCreationTimestamp="2026-04-23 17:41:41 +0000 UTC" firstStartedPulling="2026-04-23 17:43:50.044999303 +0000 UTC m=+161.744600154" lastFinishedPulling="2026-04-23 17:43:51.767453078 +0000 UTC m=+163.467053943" observedRunningTime="2026-04-23 17:43:52.461316145 +0000 UTC m=+164.160917011" watchObservedRunningTime="2026-04-23 17:43:52.461920094 +0000 UTC m=+164.161520965" Apr 23 17:43:56.070632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:56.070596 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:43:56.071018 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:56.070646 2579 prober.go:120] "Probe failed" probeType="Liveness" 
pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:43:57.354823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:57.354784 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:43:57.355316 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:57.354833 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:43:59.902003 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:43:59.901929 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:44:02.433312 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:02.433284 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7trf4"
Apr 23 17:44:05.902757 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:05.902723 2579 scope.go:117] "RemoveContainer" containerID="b65b254a82ca855e79fb809f8414d3e7deb184cf5d423bd3e381e962ea138e71"
Apr 23 17:44:06.070198 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.070162 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:06.070374 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.070226 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:06.070374 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.070265 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:44:06.070736 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.070698 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"89e21b9c8074c55c24b90fc2b7bf5d4c8a7407293af72de2a4381a0356aa8b9d"} pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 17:44:06.074247 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.074221 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:06.074367 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.074260 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:06.470864 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.470833 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log"
Apr 23 17:44:06.471046 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.470955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" event={"ID":"0e6eb10c-aea8-4862-8cc7-ecd18b5f6498","Type":"ContainerStarted","Data":"a25a5452c16b8be63b7c78c8189c520edaf7e9c0346a9a9c3f996f27eb0fed73"}
Apr 23 17:44:06.471306 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.471283 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:44:06.494515 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.494463 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5" podStartSLOduration=53.999306071 podStartE2EDuration="56.494449801s" podCreationTimestamp="2026-04-23 17:43:10 +0000 UTC" firstStartedPulling="2026-04-23 17:43:10.798900887 +0000 UTC m=+122.498501736" lastFinishedPulling="2026-04-23 17:43:13.294044615 +0000 UTC m=+124.993645466" observedRunningTime="2026-04-23 17:44:06.494218073 +0000 UTC m=+178.193818945" watchObservedRunningTime="2026-04-23 17:44:06.494449801 +0000 UTC m=+178.194050672"
Apr 23 17:44:06.651125 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:06.651097 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-d79k5"
Apr 23 17:44:16.074770 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:16.074730 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:16.075169 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:16.074784 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:24.518795 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:24.518760 2579 generic.go:358] "Generic (PLEG): container finished" podID="745573c0-66a3-4265-893e-2f9ebdaa8f3c" containerID="20776d87ad35e1ddaae619ab589e1c4bb22279bba4a102442a586b7b5b3ec95b" exitCode=0
Apr 23 17:44:24.519192 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:24.518804 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb" event={"ID":"745573c0-66a3-4265-893e-2f9ebdaa8f3c","Type":"ContainerDied","Data":"20776d87ad35e1ddaae619ab589e1c4bb22279bba4a102442a586b7b5b3ec95b"}
Apr 23 17:44:24.519192 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:24.519106 2579 scope.go:117] "RemoveContainer" containerID="20776d87ad35e1ddaae619ab589e1c4bb22279bba4a102442a586b7b5b3ec95b"
Apr 23 17:44:25.524301 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:25.524260 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-659cb" event={"ID":"745573c0-66a3-4265-893e-2f9ebdaa8f3c","Type":"ContainerStarted","Data":"c8574ce629dee69625cc0ee8fe4c60684a23b75d089dcced2ab22063a46256c8"}
Apr 23 17:44:26.075279 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:26.075248 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:26.075472 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:26.075304 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:31.089835 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:31.089786 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" containerID="cri-o://89e21b9c8074c55c24b90fc2b7bf5d4c8a7407293af72de2a4381a0356aa8b9d" gracePeriod=30
Apr 23 17:44:31.542709 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:31.542669 2579 generic.go:358] "Generic (PLEG): container finished" podID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerID="89e21b9c8074c55c24b90fc2b7bf5d4c8a7407293af72de2a4381a0356aa8b9d" exitCode=0
Apr 23 17:44:31.542898 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:31.542741 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerDied","Data":"89e21b9c8074c55c24b90fc2b7bf5d4c8a7407293af72de2a4381a0356aa8b9d"}
Apr 23 17:44:31.542898 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:31.542784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerStarted","Data":"453f868938487bebe568b7e3628e77c306f17171bae68dea3266ec8eea45d28b"}
Apr 23 17:44:31.543001 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:31.542973 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:44:39.563736 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:39.563705 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc12dc89-ceae-445a-92dc-0ec601992482" containerID="8bafc571aa8c60bf93f5f70ca1691b929d04f59c7611b0967a89d85316ab3645" exitCode=0
Apr 23 17:44:39.564162 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:39.563774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dsmq2" event={"ID":"fc12dc89-ceae-445a-92dc-0ec601992482","Type":"ContainerDied","Data":"8bafc571aa8c60bf93f5f70ca1691b929d04f59c7611b0967a89d85316ab3645"}
Apr 23 17:44:39.564162 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:39.564080 2579 scope.go:117] "RemoveContainer" containerID="8bafc571aa8c60bf93f5f70ca1691b929d04f59c7611b0967a89d85316ab3645"
Apr 23 17:44:40.568342 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:40.568308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dsmq2" event={"ID":"fc12dc89-ceae-445a-92dc-0ec601992482","Type":"ContainerStarted","Data":"98b4f28275c7ba26a7d2af10b4950cabb25b0527caae69b9e288c4f7ab7c4054"}
Apr 23 17:44:40.865131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:40.865043 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7trf4_d9288ab5-ccb4-416b-aa52-180278252652/dns/0.log"
Apr 23 17:44:41.065761 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:41.065730 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7trf4_d9288ab5-ccb4-416b-aa52-180278252652/kube-rbac-proxy/0.log"
Apr 23 17:44:42.465681 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:42.465655 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-thrhm_994fa59f-956e-4d6b-8074-e3d2459771d9/dns-node-resolver/0.log"
Apr 23 17:44:42.866782 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:42.866756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7bcfcf8949-44j22_ebd07caf-40e6-449a-9045-8ed993d63a23/registry/0.log"
Apr 23 17:44:43.066132 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:43.066105 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7bcfcf8949-44j22_ebd07caf-40e6-449a-9045-8ed993d63a23/registry/1.log"
Apr 23 17:44:43.466067 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:43.466033 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jldlx_672ee16e-ccf3-47b3-a727-a91e0e7a9fbc/node-ca/0.log"
Apr 23 17:44:44.265798 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:44.265772 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-g4zr9_abe92563-1ac2-4f25-b349-ffe1fce0022f/serve-healthcheck-canary/0.log"
Apr 23 17:44:46.070482 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:46.070449 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:46.070860 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:46.070501 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:52.549558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:52.549520 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:52.549979 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:52.549574 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:56.070203 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:56.070157 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:56.070626 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:44:56.070224 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:02.549334 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:02.549296 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:02.549787 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:02.549358 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:06.070332 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:06.070299 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:06.070731 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:06.070350 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:06.070731 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:06.070389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:45:06.070848 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:06.070821 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"453f868938487bebe568b7e3628e77c306f17171bae68dea3266ec8eea45d28b"} pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 17:45:06.074237 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:06.074204 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:06.074340 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:06.074256 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:16.074822 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:16.074783 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:16.075242 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:16.074838 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:20.603161 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:20.603060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:45:20.605521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:20.605501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51d0c740-3b6f-4927-90d0-03577afcf352-metrics-certs\") pod \"network-metrics-daemon-45qq7\" (UID: \"51d0c740-3b6f-4927-90d0-03577afcf352\") " pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:45:20.905930 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:20.905853 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qk5s8\""
Apr 23 17:45:20.913943 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:20.913923 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-45qq7"
Apr 23 17:45:21.036704 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:21.036668 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-45qq7"]
Apr 23 17:45:21.039844 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:45:21.039814 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51d0c740_3b6f_4927_90d0_03577afcf352.slice/crio-6b674100bf5438c3821d224bef89f1c0ff3d685382852d3c98702e56a0de8f46 WatchSource:0}: Error finding container 6b674100bf5438c3821d224bef89f1c0ff3d685382852d3c98702e56a0de8f46: Status 404 returned error can't find the container with id 6b674100bf5438c3821d224bef89f1c0ff3d685382852d3c98702e56a0de8f46
Apr 23 17:45:21.669015 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:21.668977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-45qq7" event={"ID":"51d0c740-3b6f-4927-90d0-03577afcf352","Type":"ContainerStarted","Data":"6b674100bf5438c3821d224bef89f1c0ff3d685382852d3c98702e56a0de8f46"}
Apr 23 17:45:22.675031 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:22.674909 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-45qq7" event={"ID":"51d0c740-3b6f-4927-90d0-03577afcf352","Type":"ContainerStarted","Data":"a68a3e579948ffc228f4dfd86d1191238a58e83458d24f10ecd7e8b78b978b4d"}
Apr 23 17:45:22.675031 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:22.674962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-45qq7" event={"ID":"51d0c740-3b6f-4927-90d0-03577afcf352","Type":"ContainerStarted","Data":"02df0c6184c48c129b5e001ca1a439b4fe6f69c94ce3f9bd0c6ebbb479ea7652"}
Apr 23 17:45:22.695140 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:22.695092 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-45qq7" podStartSLOduration=253.815605014 podStartE2EDuration="4m14.695077001s" podCreationTimestamp="2026-04-23 17:41:08 +0000 UTC" firstStartedPulling="2026-04-23 17:45:21.04162898 +0000 UTC m=+252.741229834" lastFinishedPulling="2026-04-23 17:45:21.921100961 +0000 UTC m=+253.620701821" observedRunningTime="2026-04-23 17:45:22.693280596 +0000 UTC m=+254.392881469" watchObservedRunningTime="2026-04-23 17:45:22.695077001 +0000 UTC m=+254.394677873"
Apr 23 17:45:26.074983 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:26.074922 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:26.075376 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:26.075001 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:31.089277 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:31.089233 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" containerID="cri-o://453f868938487bebe568b7e3628e77c306f17171bae68dea3266ec8eea45d28b" gracePeriod=30
Apr 23 17:45:31.700742 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:31.700707 2579 generic.go:358] "Generic (PLEG): container finished" podID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerID="453f868938487bebe568b7e3628e77c306f17171bae68dea3266ec8eea45d28b" exitCode=0
Apr 23 17:45:31.700954 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:31.700778 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerDied","Data":"453f868938487bebe568b7e3628e77c306f17171bae68dea3266ec8eea45d28b"}
Apr 23 17:45:31.700954 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:31.700804 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerStarted","Data":"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df"}
Apr 23 17:45:31.700954 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:31.700816 2579 scope.go:117] "RemoveContainer" containerID="89e21b9c8074c55c24b90fc2b7bf5d4c8a7407293af72de2a4381a0356aa8b9d"
Apr 23 17:45:31.701132 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:31.701079 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:45:46.071987 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:46.071931 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:46.072402 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:46.072016 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:52.709027 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:52.708994 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:52.709390 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:52.709053 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:56.069885 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:56.069852 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:56.070358 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:45:56.069910 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:02.709471 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:02.709425 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:02.709845 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:02.709493 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:06.070627 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:06.070594 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:06.071056 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:06.070653 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:06.071056 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:06.070692 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:46:06.071165 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:06.071140 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df"} pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 17:46:06.074346 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:06.074322 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:06.074455 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:06.074364 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:08.779522 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:08.779490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log"
Apr 23 17:46:08.779986 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:08.779759 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log"
Apr 23 17:46:08.782534 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:08.782511 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 17:46:08.782720 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:08.782703 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 17:46:08.789258 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:08.789238 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 17:46:16.075335 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:16.075296 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:16.077680 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:16.075359 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:26.075220 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:26.075186 2579 patch_prober.go:28] interesting pod/image-registry-7bcfcf8949-44j22 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:46:26.075639 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:26.075245 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:31.089314 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.089268 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" containerID="cri-o://ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df" gracePeriod=30
Apr 23 17:46:31.201529 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.201510 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:46:31.853153 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.853123 2579 generic.go:358] "Generic (PLEG): container finished" podID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerID="ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df" exitCode=0
Apr 23 17:46:31.853321 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.853201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerDied","Data":"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df"}
Apr 23 17:46:31.853321 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.853234 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerStarted","Data":"fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f"}
Apr 23 17:46:31.853321 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.853252 2579 scope.go:117] "RemoveContainer" containerID="453f868938487bebe568b7e3628e77c306f17171bae68dea3266ec8eea45d28b"
Apr 23 17:46:31.853453 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:31.853433 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:46:37.469854 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.469821 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg"]
Apr 23 17:46:37.474952 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.474915 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg"
Apr 23 17:46:37.475115 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.475091 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-89vtf"]
Apr 23 17:46:37.477344 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.477320 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-nqsnr\""
Apr 23 17:46:37.478605 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.478583 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.480805 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.480783 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dt5dv\""
Apr 23 17:46:37.480914 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.480832 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 17:46:37.481357 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.481343 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 17:46:37.486798 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.486779 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg"]
Apr 23 17:46:37.494675 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.494643 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bcfcf8949-44j22"]
Apr 23 17:46:37.495803 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.495783 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-89vtf"]
Apr 23 17:46:37.526540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.526507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-data-volume\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.526540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.526542 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589q8\" (UniqueName: \"kubernetes.io/projected/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-kube-api-access-589q8\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.526708 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.526561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.526708 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.526581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgss2\" (UniqueName: \"kubernetes.io/projected/45adc12a-53d2-46dc-a15b-d354387909c2-kube-api-access-rgss2\") pod \"network-check-source-8894fc9bd-v5kgg\" (UID: \"45adc12a-53d2-46dc-a15b-d354387909c2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg"
Apr 23 17:46:37.526708 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.526598 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-crio-socket\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.526708 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.526620 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.627965 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.627919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.628121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.627998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-data-volume\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.628121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-589q8\" (UniqueName: \"kubernetes.io/projected/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-kube-api-access-589q8\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.628121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628038 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf"
Apr 23 17:46:37.628249 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rgss2\" (UniqueName: \"kubernetes.io/projected/45adc12a-53d2-46dc-a15b-d354387909c2-kube-api-access-rgss2\") pod \"network-check-source-8894fc9bd-v5kgg\" (UID: \"45adc12a-53d2-46dc-a15b-d354387909c2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg" Apr 23 17:46:37.628249 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-crio-socket\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.628346 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-crio-socket\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.628400 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-data-volume\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.628554 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.628535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.630501 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.630479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.656861 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.656831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-589q8\" (UniqueName: \"kubernetes.io/projected/f28b3e97-3375-41b6-8e26-7f03bbe44a6c-kube-api-access-589q8\") pod \"insights-runtime-extractor-89vtf\" (UID: \"f28b3e97-3375-41b6-8e26-7f03bbe44a6c\") " pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.670283 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.670249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgss2\" (UniqueName: \"kubernetes.io/projected/45adc12a-53d2-46dc-a15b-d354387909c2-kube-api-access-rgss2\") pod \"network-check-source-8894fc9bd-v5kgg\" (UID: \"45adc12a-53d2-46dc-a15b-d354387909c2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg" Apr 23 17:46:37.786214 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.786177 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg" Apr 23 17:46:37.792662 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.792644 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-89vtf" Apr 23 17:46:37.933955 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.933907 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg"] Apr 23 17:46:37.937125 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:37.937098 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45adc12a_53d2_46dc_a15b_d354387909c2.slice/crio-bd843e5e445aa322ac9398c6b109235c890e4e6d5966c360995559ed506af3e8 WatchSource:0}: Error finding container bd843e5e445aa322ac9398c6b109235c890e4e6d5966c360995559ed506af3e8: Status 404 returned error can't find the container with id bd843e5e445aa322ac9398c6b109235c890e4e6d5966c360995559ed506af3e8 Apr 23 17:46:37.963836 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:37.963816 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-89vtf"] Apr 23 17:46:37.970087 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:37.970062 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf28b3e97_3375_41b6_8e26_7f03bbe44a6c.slice/crio-aff52cd8941558fd744b9f2decb9a2347100dd64f28c2fcd74d17fb618a14e49 WatchSource:0}: Error finding container aff52cd8941558fd744b9f2decb9a2347100dd64f28c2fcd74d17fb618a14e49: Status 404 returned error can't find the container with id aff52cd8941558fd744b9f2decb9a2347100dd64f28c2fcd74d17fb618a14e49 Apr 23 17:46:38.873720 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:38.873626 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg" event={"ID":"45adc12a-53d2-46dc-a15b-d354387909c2","Type":"ContainerStarted","Data":"591ad9eb781fd39130bc0f838ea37c9c9bc6a123d58744902957b2307c475a98"} Apr 23 17:46:38.873720 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:38.873661 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg" event={"ID":"45adc12a-53d2-46dc-a15b-d354387909c2","Type":"ContainerStarted","Data":"bd843e5e445aa322ac9398c6b109235c890e4e6d5966c360995559ed506af3e8"} Apr 23 17:46:38.875253 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:38.875229 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89vtf" event={"ID":"f28b3e97-3375-41b6-8e26-7f03bbe44a6c","Type":"ContainerStarted","Data":"d05b0daf1ecf4d8dbdd9053cb5e292916165ec8afbb0363a043c3837f17362a1"} Apr 23 17:46:38.875377 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:38.875259 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89vtf" event={"ID":"f28b3e97-3375-41b6-8e26-7f03bbe44a6c","Type":"ContainerStarted","Data":"6bbe60217def53dffc71a18ff21661bd552712dfd53be722e89391e0f73b558d"} Apr 23 17:46:38.875377 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:38.875275 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89vtf" event={"ID":"f28b3e97-3375-41b6-8e26-7f03bbe44a6c","Type":"ContainerStarted","Data":"aff52cd8941558fd744b9f2decb9a2347100dd64f28c2fcd74d17fb618a14e49"} Apr 23 17:46:38.929252 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:38.929196 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v5kgg" podStartSLOduration=1.929175646 podStartE2EDuration="1.929175646s" podCreationTimestamp="2026-04-23 17:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:46:38.890470471 +0000 UTC m=+330.590071343" watchObservedRunningTime="2026-04-23 17:46:38.929175646 +0000 UTC m=+330.628776520" Apr 
23 17:46:40.882113 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:40.882070 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89vtf" event={"ID":"f28b3e97-3375-41b6-8e26-7f03bbe44a6c","Type":"ContainerStarted","Data":"5642ecfc72c2c312ca5b376316bc45cfd8d4a0d39ec2ca57b88172a36d4962b2"} Apr 23 17:46:40.924959 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:40.924887 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-89vtf" podStartSLOduration=1.986363215 podStartE2EDuration="3.924870118s" podCreationTimestamp="2026-04-23 17:46:37 +0000 UTC" firstStartedPulling="2026-04-23 17:46:38.015848612 +0000 UTC m=+329.715449465" lastFinishedPulling="2026-04-23 17:46:39.954355514 +0000 UTC m=+331.653956368" observedRunningTime="2026-04-23 17:46:40.923138624 +0000 UTC m=+332.622739495" watchObservedRunningTime="2026-04-23 17:46:40.924870118 +0000 UTC m=+332.624471047" Apr 23 17:46:44.993902 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:44.993863 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj"] Apr 23 17:46:44.997114 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:44.997086 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.001653 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.001632 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 17:46:45.001737 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.001661 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 17:46:45.024348 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.024319 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 17:46:45.035306 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.035283 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xcc8x\"" Apr 23 17:46:45.050148 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.050122 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj"] Apr 23 17:46:45.085475 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.085439 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.085618 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.085492 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.085618 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.085585 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhv5g\" (UniqueName: \"kubernetes.io/projected/2ea54b50-dbce-4156-ab82-8df3cfa10a71-kube-api-access-dhv5g\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.085618 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.085614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea54b50-dbce-4156-ab82-8df3cfa10a71-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.140190 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.140158 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ldjsl"] Apr 23 17:46:45.143429 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.143409 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nnznq"] Apr 23 17:46:45.143585 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.143567 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.146521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.146500 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.152318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.152293 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:46:45.152438 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.152369 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 17:46:45.152438 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.152401 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wz9mt\"" Apr 23 17:46:45.152646 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.152628 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fbl2d\"" Apr 23 17:46:45.152900 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.152885 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:46:45.153873 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.153856 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:46:45.154148 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.154133 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 17:46:45.154871 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.154853 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 17:46:45.162763 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.162742 2579 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ldjsl"] Apr 23 17:46:45.187024 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.186989 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhv5g\" (UniqueName: \"kubernetes.io/projected/2ea54b50-dbce-4156-ab82-8df3cfa10a71-kube-api-access-dhv5g\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.187024 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.187026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea54b50-dbce-4156-ab82-8df3cfa10a71-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.187199 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.187055 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.187199 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.187093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.187269 ip-10-0-131-107 
kubenswrapper[2579]: E0423 17:46:45.187210 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 23 17:46:45.187304 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:45.187280 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-tls podName:2ea54b50-dbce-4156-ab82-8df3cfa10a71 nodeName:}" failed. No retries permitted until 2026-04-23 17:46:45.687258359 +0000 UTC m=+337.386859209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-n6zwj" (UID: "2ea54b50-dbce-4156-ab82-8df3cfa10a71") : secret "openshift-state-metrics-tls" not found Apr 23 17:46:45.187662 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.187643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea54b50-dbce-4156-ab82-8df3cfa10a71-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.189669 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.189648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.227779 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.227754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dhv5g\" (UniqueName: \"kubernetes.io/projected/2ea54b50-dbce-4156-ab82-8df3cfa10a71-kube-api-access-dhv5g\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.287877 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.287841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-textfile\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.287877 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.287876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.287896 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.287978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.288121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-root\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-sys\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288089 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.288292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-metrics-client-ca\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288292 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.288292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mr4\" (UniqueName: \"kubernetes.io/projected/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-api-access-55mr4\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.288292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.288292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-wtmp\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288292 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288281 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9t2zq\" (UniqueName: \"kubernetes.io/projected/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-kube-api-access-9t2zq\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.288490 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288303 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.288490 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.288334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-tls\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.389476 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t2zq\" (UniqueName: \"kubernetes.io/projected/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-kube-api-access-9t2zq\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.389476 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: 
\"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.389676 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-tls\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.389676 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:45.389591 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 17:46:45.389676 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:45.389652 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-tls podName:5548baa4-2eb8-441d-a84d-db3b7a2b5e6e nodeName:}" failed. No retries permitted until 2026-04-23 17:46:45.889635109 +0000 UTC m=+337.589235960 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-tls") pod "node-exporter-nnznq" (UID: "5548baa4-2eb8-441d-a84d-db3b7a2b5e6e") : secret "node-exporter-tls" not found Apr 23 17:46:45.389822 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-textfile\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.389822 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.389822 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.389822 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: 
\"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389840 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-root\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-sys\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-metrics-client-ca\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-sys\") pod \"node-exporter-nnznq\" 
(UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390041 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390275 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55mr4\" (UniqueName: \"kubernetes.io/projected/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-api-access-55mr4\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390275 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390275 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-wtmp\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390275 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-textfile\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390399 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-wtmp\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390399 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.389994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-root\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.390399 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390491 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:45.390407 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 17:46:45.390491 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:45.390462 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-tls podName:d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:46:45.890444888 +0000 UTC m=+337.590045743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-ldjsl" (UID: "d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68") : secret "kube-state-metrics-tls" not found Apr 23 17:46:45.390642 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390622 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.390782 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-metrics-client-ca\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.391032 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.390894 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.391137 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.391106 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.392453 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.392425 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.392670 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.392650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.433286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.433247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t2zq\" (UniqueName: \"kubernetes.io/projected/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-kube-api-access-9t2zq\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.434211 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.434190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mr4\" (UniqueName: \"kubernetes.io/projected/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-api-access-55mr4\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 
17:46:45.692828 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.692783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.695346 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.695323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea54b50-dbce-4156-ab82-8df3cfa10a71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n6zwj\" (UID: \"2ea54b50-dbce-4156-ab82-8df3cfa10a71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:45.895348 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.895247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-tls\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.895348 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.895330 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.897996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.897967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/5548baa4-2eb8-441d-a84d-db3b7a2b5e6e-node-exporter-tls\") pod \"node-exporter-nnznq\" (UID: \"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e\") " pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:45.898133 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.898009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ldjsl\" (UID: \"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:45.905534 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:45.905505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" Apr 23 17:46:46.043889 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.043863 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj"] Apr 23 17:46:46.046752 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:46.046725 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea54b50_dbce_4156_ab82_8df3cfa10a71.slice/crio-ab2136261a767028b07e239d5dae2645004479328d7b4742b3b22308e3f90db3 WatchSource:0}: Error finding container ab2136261a767028b07e239d5dae2645004479328d7b4742b3b22308e3f90db3: Status 404 returned error can't find the container with id ab2136261a767028b07e239d5dae2645004479328d7b4742b3b22308e3f90db3 Apr 23 17:46:46.053667 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.053645 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" Apr 23 17:46:46.058237 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.058218 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nnznq" Apr 23 17:46:46.070159 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:46.070132 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5548baa4_2eb8_441d_a84d_db3b7a2b5e6e.slice/crio-151d3d774afaa5718044a9cd647af0f7199778b0f9f91fc6d80c68d4cdb22015 WatchSource:0}: Error finding container 151d3d774afaa5718044a9cd647af0f7199778b0f9f91fc6d80c68d4cdb22015: Status 404 returned error can't find the container with id 151d3d774afaa5718044a9cd647af0f7199778b0f9f91fc6d80c68d4cdb22015 Apr 23 17:46:46.214842 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.214817 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ldjsl"] Apr 23 17:46:46.219977 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:46.219931 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00b29bf_616c_4a9e_9e3a_3c4c49b1dd68.slice/crio-b8430bf049426c6bd707cab4c27ba878a2b6ef3dbaf037641cb92ff19e27aad9 WatchSource:0}: Error finding container b8430bf049426c6bd707cab4c27ba878a2b6ef3dbaf037641cb92ff19e27aad9: Status 404 returned error can't find the container with id b8430bf049426c6bd707cab4c27ba878a2b6ef3dbaf037641cb92ff19e27aad9 Apr 23 17:46:46.913836 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.911153 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nnznq" event={"ID":"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e","Type":"ContainerStarted","Data":"aca2ce22869a09e4c8a774eb7d20ae8a474ee65eeb86020cb25cc287b1baae37"} Apr 23 17:46:46.913836 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.911195 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nnznq" 
event={"ID":"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e","Type":"ContainerStarted","Data":"151d3d774afaa5718044a9cd647af0f7199778b0f9f91fc6d80c68d4cdb22015"} Apr 23 17:46:46.916742 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.916707 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" event={"ID":"2ea54b50-dbce-4156-ab82-8df3cfa10a71","Type":"ContainerStarted","Data":"ca8d46e5d4757b79fb4b5eae61a66fbcbb0121bb8532d1093761b5f191b29070"} Apr 23 17:46:46.916914 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.916750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" event={"ID":"2ea54b50-dbce-4156-ab82-8df3cfa10a71","Type":"ContainerStarted","Data":"960ca9d5fbb9f1fc287f561a52251d12f26d007e524eed1b8ca976df9a430f3a"} Apr 23 17:46:46.916914 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.916764 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" event={"ID":"2ea54b50-dbce-4156-ab82-8df3cfa10a71","Type":"ContainerStarted","Data":"ab2136261a767028b07e239d5dae2645004479328d7b4742b3b22308e3f90db3"} Apr 23 17:46:46.918778 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:46.918740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" event={"ID":"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68","Type":"ContainerStarted","Data":"b8430bf049426c6bd707cab4c27ba878a2b6ef3dbaf037641cb92ff19e27aad9"} Apr 23 17:46:47.500432 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.500409 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:46:47.923069 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.923037 2579 generic.go:358] "Generic (PLEG): container finished" podID="5548baa4-2eb8-441d-a84d-db3b7a2b5e6e" 
containerID="aca2ce22869a09e4c8a774eb7d20ae8a474ee65eeb86020cb25cc287b1baae37" exitCode=0 Apr 23 17:46:47.923270 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.923107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nnznq" event={"ID":"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e","Type":"ContainerDied","Data":"aca2ce22869a09e4c8a774eb7d20ae8a474ee65eeb86020cb25cc287b1baae37"} Apr 23 17:46:47.924925 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.924894 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" event={"ID":"2ea54b50-dbce-4156-ab82-8df3cfa10a71","Type":"ContainerStarted","Data":"5936d21534ce2c0cbca932995b9119b79266c04fa04eb577f8eaabeb8a842b3e"} Apr 23 17:46:47.926927 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.926904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" event={"ID":"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68","Type":"ContainerStarted","Data":"b354888f2441a7a4e25be6e4886bc14d1d328ad2b688f20fab4bb5b851d7a7e8"} Apr 23 17:46:47.926927 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.926948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" event={"ID":"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68","Type":"ContainerStarted","Data":"eaabee4fee6c5dec1e77ed221dd1e2616de2fc0ba9670d783e26c177552244aa"} Apr 23 17:46:47.927090 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:47.926962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" event={"ID":"d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68","Type":"ContainerStarted","Data":"2152d8bbbcafc85a0a4e20ec16099fc245e2fe0b3e8377a814a58c28b7cca614"} Apr 23 17:46:48.058135 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:48.058077 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n6zwj" podStartSLOduration=2.806451543 podStartE2EDuration="4.058055311s" podCreationTimestamp="2026-04-23 17:46:44 +0000 UTC" firstStartedPulling="2026-04-23 17:46:46.209766954 +0000 UTC m=+337.909367806" lastFinishedPulling="2026-04-23 17:46:47.461370725 +0000 UTC m=+339.160971574" observedRunningTime="2026-04-23 17:46:48.057347078 +0000 UTC m=+339.756947953" watchObservedRunningTime="2026-04-23 17:46:48.058055311 +0000 UTC m=+339.757656184" Apr 23 17:46:48.058795 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:48.058733 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-ldjsl" podStartSLOduration=1.81713608 podStartE2EDuration="3.058722567s" podCreationTimestamp="2026-04-23 17:46:45 +0000 UTC" firstStartedPulling="2026-04-23 17:46:46.222057035 +0000 UTC m=+337.921657888" lastFinishedPulling="2026-04-23 17:46:47.463643526 +0000 UTC m=+339.163244375" observedRunningTime="2026-04-23 17:46:48.004780374 +0000 UTC m=+339.704381251" watchObservedRunningTime="2026-04-23 17:46:48.058722567 +0000 UTC m=+339.758323441" Apr 23 17:46:48.931392 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:48.931354 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nnznq" event={"ID":"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e","Type":"ContainerStarted","Data":"2c875fb5945d60f42d6ae38701e3bfa9865e8fc8e6821a5b1f28d1282a18eec9"} Apr 23 17:46:48.931392 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:48.931395 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nnznq" event={"ID":"5548baa4-2eb8-441d-a84d-db3b7a2b5e6e","Type":"ContainerStarted","Data":"cbdb9b30c61ad9d246b8b7cb7698f6c74129d4aca9289eec340e30af7c13d743"} Apr 23 17:46:48.959860 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:48.959808 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-nnznq" podStartSLOduration=3.258804401 podStartE2EDuration="3.959794033s" podCreationTimestamp="2026-04-23 17:46:45 +0000 UTC" firstStartedPulling="2026-04-23 17:46:46.071814303 +0000 UTC m=+337.771415154" lastFinishedPulling="2026-04-23 17:46:46.772803935 +0000 UTC m=+338.472404786" observedRunningTime="2026-04-23 17:46:48.959486725 +0000 UTC m=+340.659087598" watchObservedRunningTime="2026-04-23 17:46:48.959794033 +0000 UTC m=+340.659394950" Apr 23 17:46:49.786195 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.786158 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr"] Apr 23 17:46:49.791032 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.790985 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" Apr 23 17:46:49.793923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.793901 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-zrc8s\"" Apr 23 17:46:49.794356 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.794338 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 17:46:49.802720 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.802696 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr"] Apr 23 17:46:49.911398 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.911360 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6dc9db7797-6hfqn"] Apr 23 17:46:49.914498 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.914482 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:49.918238 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.918213 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 17:46:49.918370 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.918353 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 17:46:49.918606 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.918592 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-79icdga140pe4\"" Apr 23 17:46:49.920695 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.920675 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 17:46:49.920922 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.920839 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-857v9\"" Apr 23 17:46:49.924065 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.924046 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 17:46:49.930179 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.930161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8b9985d-f43d-498a-89a5-e00d324d21e2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xjczr\" (UID: \"d8b9985d-f43d-498a-89a5-e00d324d21e2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" Apr 23 17:46:49.935696 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:49.935673 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/metrics-server-6dc9db7797-6hfqn"] Apr 23 17:46:50.031088 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-client-ca-bundle\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031101 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-secret-metrics-server-tls\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031133 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fada3561-13e0-4497-94b2-f78573fc03cc-audit-log\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmdf\" (UniqueName: \"kubernetes.io/projected/fada3561-13e0-4497-94b2-f78573fc03cc-kube-api-access-bzmdf\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031187 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-secret-metrics-server-client-certs\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031231 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fada3561-13e0-4497-94b2-f78573fc03cc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031259 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fada3561-13e0-4497-94b2-f78573fc03cc-metrics-server-audit-profiles\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.031488 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.031324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8b9985d-f43d-498a-89a5-e00d324d21e2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xjczr\" (UID: \"d8b9985d-f43d-498a-89a5-e00d324d21e2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" Apr 23 17:46:50.031488 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:50.031433 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 
17:46:50.031546 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:46:50.031493 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b9985d-f43d-498a-89a5-e00d324d21e2-monitoring-plugin-cert podName:d8b9985d-f43d-498a-89a5-e00d324d21e2 nodeName:}" failed. No retries permitted until 2026-04-23 17:46:50.531476353 +0000 UTC m=+342.231077203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d8b9985d-f43d-498a-89a5-e00d324d21e2-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-xjczr" (UID: "d8b9985d-f43d-498a-89a5-e00d324d21e2") : secret "monitoring-plugin-cert" not found Apr 23 17:46:50.132472 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-client-ca-bundle\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.132472 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-secret-metrics-server-tls\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.132472 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132472 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fada3561-13e0-4497-94b2-f78573fc03cc-audit-log\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 
17:46:50.132748 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmdf\" (UniqueName: \"kubernetes.io/projected/fada3561-13e0-4497-94b2-f78573fc03cc-kube-api-access-bzmdf\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.132748 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-secret-metrics-server-client-certs\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.132748 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fada3561-13e0-4497-94b2-f78573fc03cc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.132748 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.132612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fada3561-13e0-4497-94b2-f78573fc03cc-metrics-server-audit-profiles\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.133186 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.133163 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/fada3561-13e0-4497-94b2-f78573fc03cc-audit-log\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.133692 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.133662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fada3561-13e0-4497-94b2-f78573fc03cc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.133791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.133709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fada3561-13e0-4497-94b2-f78573fc03cc-metrics-server-audit-profiles\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.135113 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.135074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-client-ca-bundle\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:46:50.135234 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.135210 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-secret-metrics-server-client-certs\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " 
pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn"
Apr 23 17:46:50.135328 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.135313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fada3561-13e0-4497-94b2-f78573fc03cc-secret-metrics-server-tls\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn"
Apr 23 17:46:50.144456 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.144429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmdf\" (UniqueName: \"kubernetes.io/projected/fada3561-13e0-4497-94b2-f78573fc03cc-kube-api-access-bzmdf\") pod \"metrics-server-6dc9db7797-6hfqn\" (UID: \"fada3561-13e0-4497-94b2-f78573fc03cc\") " pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn"
Apr 23 17:46:50.223623 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.223588 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn"
Apr 23 17:46:50.353063 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.353019 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6dc9db7797-6hfqn"]
Apr 23 17:46:50.355455 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:50.355433 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfada3561_13e0_4497_94b2_f78573fc03cc.slice/crio-466d9038861a50c12438ca7483ecb0bf1dadde0ad5c2bf72fe18bd903f422bfa WatchSource:0}: Error finding container 466d9038861a50c12438ca7483ecb0bf1dadde0ad5c2bf72fe18bd903f422bfa: Status 404 returned error can't find the container with id 466d9038861a50c12438ca7483ecb0bf1dadde0ad5c2bf72fe18bd903f422bfa
Apr 23 17:46:50.536564 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.536533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8b9985d-f43d-498a-89a5-e00d324d21e2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xjczr\" (UID: \"d8b9985d-f43d-498a-89a5-e00d324d21e2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr"
Apr 23 17:46:50.539072 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.539053 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8b9985d-f43d-498a-89a5-e00d324d21e2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xjczr\" (UID: \"d8b9985d-f43d-498a-89a5-e00d324d21e2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr"
Apr 23 17:46:50.700350 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.700320 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr"
Apr 23 17:46:50.856902 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.856821 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr"]
Apr 23 17:46:50.860758 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:46:50.860729 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b9985d_f43d_498a_89a5_e00d324d21e2.slice/crio-4b8442c36427da3d94f5456d93c50e613d70edf998311222a1a8b4b743296d16 WatchSource:0}: Error finding container 4b8442c36427da3d94f5456d93c50e613d70edf998311222a1a8b4b743296d16: Status 404 returned error can't find the container with id 4b8442c36427da3d94f5456d93c50e613d70edf998311222a1a8b4b743296d16
Apr 23 17:46:50.938559 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.938508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" event={"ID":"fada3561-13e0-4497-94b2-f78573fc03cc","Type":"ContainerStarted","Data":"466d9038861a50c12438ca7483ecb0bf1dadde0ad5c2bf72fe18bd903f422bfa"}
Apr 23 17:46:50.939684 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:50.939657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" event={"ID":"d8b9985d-f43d-498a-89a5-e00d324d21e2","Type":"ContainerStarted","Data":"4b8442c36427da3d94f5456d93c50e613d70edf998311222a1a8b4b743296d16"}
Apr 23 17:46:51.944581 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:51.944499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" event={"ID":"fada3561-13e0-4497-94b2-f78573fc03cc","Type":"ContainerStarted","Data":"08da7ece9371a78aa8187089f841b33a7047fe26a5df23d857cd28d3eddb8e96"}
Apr 23 17:46:51.974814 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:51.974751 2579
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" podStartSLOduration=1.77688846 podStartE2EDuration="2.974730588s" podCreationTimestamp="2026-04-23 17:46:49 +0000 UTC" firstStartedPulling="2026-04-23 17:46:50.357332073 +0000 UTC m=+342.056932924" lastFinishedPulling="2026-04-23 17:46:51.555174203 +0000 UTC m=+343.254775052" observedRunningTime="2026-04-23 17:46:51.972618907 +0000 UTC m=+343.672219784" watchObservedRunningTime="2026-04-23 17:46:51.974730588 +0000 UTC m=+343.674331464" Apr 23 17:46:52.948812 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:52.948773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" event={"ID":"d8b9985d-f43d-498a-89a5-e00d324d21e2","Type":"ContainerStarted","Data":"f2683bbcfef8e60b641a57c442cf09fada4491a3bd2109c36dbd9fc89bd5bb04"} Apr 23 17:46:52.949287 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:52.949079 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" Apr 23 17:46:52.953509 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:52.953490 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" Apr 23 17:46:52.967774 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:46:52.967730 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xjczr" podStartSLOduration=2.5182850329999997 podStartE2EDuration="3.967719128s" podCreationTimestamp="2026-04-23 17:46:49 +0000 UTC" firstStartedPulling="2026-04-23 17:46:50.863112597 +0000 UTC m=+342.562713448" lastFinishedPulling="2026-04-23 17:46:52.312546691 +0000 UTC m=+344.012147543" observedRunningTime="2026-04-23 17:46:52.965050881 +0000 UTC m=+344.664651754" watchObservedRunningTime="2026-04-23 17:46:52.967719128 
+0000 UTC m=+344.667319999" Apr 23 17:47:02.515062 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.514995 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" containerID="cri-o://fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f" gracePeriod=30 Apr 23 17:47:02.746271 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.746240 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" Apr 23 17:47:02.842000 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.841897 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-installation-pull-secrets\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842000 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.841966 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-trusted-ca\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842000 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.841987 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv8rk\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-kube-api-access-zv8rk\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842018 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-bound-sa-token\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842040 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842084 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-image-registry-private-configuration\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842127 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebd07caf-40e6-449a-9045-8ed993d63a23-ca-trust-extracted\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842248 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842152 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-certificates\") pod \"ebd07caf-40e6-449a-9045-8ed993d63a23\" (UID: \"ebd07caf-40e6-449a-9045-8ed993d63a23\") " Apr 23 17:47:02.842498 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842353 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod 
"ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:02.842696 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.842669 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:02.844533 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.844500 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:02.844643 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.844579 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:02.844705 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.844652 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:02.844927 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.844885 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-kube-api-access-zv8rk" (OuterVolumeSpecName: "kube-api-access-zv8rk") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "kube-api-access-zv8rk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:02.844927 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.844909 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:02.852421 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.852400 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd07caf-40e6-449a-9045-8ed993d63a23-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ebd07caf-40e6-449a-9045-8ed993d63a23" (UID: "ebd07caf-40e6-449a-9045-8ed993d63a23"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:47:02.942999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.942968 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-tls\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:47:02.942999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.942995 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-image-registry-private-configuration\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:47:02.942999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.943005 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebd07caf-40e6-449a-9045-8ed993d63a23-ca-trust-extracted\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:47:02.943212 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.943015 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-registry-certificates\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:47:02.943212 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.943025 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebd07caf-40e6-449a-9045-8ed993d63a23-installation-pull-secrets\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:47:02.943212 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.943033 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd07caf-40e6-449a-9045-8ed993d63a23-trusted-ca\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" 
Apr 23 17:47:02.943212 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.943041 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zv8rk\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-kube-api-access-zv8rk\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:47:02.943212 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.943049 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd07caf-40e6-449a-9045-8ed993d63a23-bound-sa-token\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:47:02.977881 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.977844 2579 generic.go:358] "Generic (PLEG): container finished" podID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerID="fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f" exitCode=0
Apr 23 17:47:02.978074 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.977910 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22"
Apr 23 17:47:02.978074 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.977925 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerDied","Data":"fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f"}
Apr 23 17:47:02.978074 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.977985 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bcfcf8949-44j22" event={"ID":"ebd07caf-40e6-449a-9045-8ed993d63a23","Type":"ContainerDied","Data":"cfef84eab84149abc055c5be4d7e557843b5bc530b97e3e08a4793ee4a32d7ce"}
Apr 23 17:47:02.978074 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.978001 2579 scope.go:117] "RemoveContainer" containerID="fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f"
Apr 23 17:47:02.985778 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.985760 2579 scope.go:117] "RemoveContainer" containerID="ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df"
Apr 23 17:47:02.992554 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.992538 2579 scope.go:117] "RemoveContainer" containerID="fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f"
Apr 23 17:47:02.992799 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:47:02.992780 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f\": container with ID starting with fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f not found: ID does not exist" containerID="fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f"
Apr 23 17:47:02.992863 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.992804 2579 pod_container_deletor.go:53] "DeleteContainer
returned error" containerID={"Type":"cri-o","ID":"fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f"} err="failed to get container status \"fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f\": rpc error: code = NotFound desc = could not find container \"fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f\": container with ID starting with fb49282cf3425e36650a51f516804628d08388b5e9b61f6c70ca3e0d6b1f052f not found: ID does not exist" Apr 23 17:47:02.992863 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.992821 2579 scope.go:117] "RemoveContainer" containerID="ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df" Apr 23 17:47:02.993065 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:47:02.993043 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df\": container with ID starting with ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df not found: ID does not exist" containerID="ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df" Apr 23 17:47:02.993118 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.993073 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df"} err="failed to get container status \"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df\": rpc error: code = NotFound desc = could not find container \"ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df\": container with ID starting with ce00cda893d93ed87a2c8c2cb1cff01ead709f37613e9a64bfff76116fe310df not found: ID does not exist" Apr 23 17:47:02.995779 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:02.995758 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bcfcf8949-44j22"] Apr 23 17:47:03.001365 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:03.001344 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bcfcf8949-44j22"] Apr 23 17:47:04.906191 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:04.906156 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" path="/var/lib/kubelet/pods/ebd07caf-40e6-449a-9045-8ed993d63a23/volumes" Apr 23 17:47:10.224111 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:10.224071 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:47:10.224111 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:10.224118 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:47:30.228923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:30.228882 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:47:30.232675 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:47:30.232644 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6dc9db7797-6hfqn" Apr 23 17:49:15.297860 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.297821 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d8c4dd7b6-rl5bt"] Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298107 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298120 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 
17:49:15.298134 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298139 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298146 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298152 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298197 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298208 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.298294 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.298215 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry"
Apr 23 17:49:15.301076 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.301053 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8c4dd7b6-rl5bt"
Apr 23 17:49:15.304457 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.304439 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 17:49:15.304690 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.304670 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 17:49:15.304841 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.304672 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 17:49:15.304996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.304980 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 17:49:15.305183 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.305170 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 17:49:15.305288 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.305264 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-5p9qj\""
Apr 23 17:49:15.305391 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.305347 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 17:49:15.305456 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.305386 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 17:49:15.321157 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.321129 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8c4dd7b6-rl5bt"]
Apr 23 17:49:15.434805 ip-10-0-131-107 kubenswrapper[2579]: I0423
17:49:15.434776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-oauth-config\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.435007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.434819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-oauth-serving-cert\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.435007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.434843 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-config\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.435007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.434887 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-serving-cert\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.435007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.434904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-service-ca\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: 
\"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.435007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.434955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjrq\" (UniqueName: \"kubernetes.io/projected/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-kube-api-access-fjjrq\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.536211 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.536173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-oauth-config\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.536380 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.536238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-oauth-serving-cert\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.536380 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.536269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-config\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.536380 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.536305 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-serving-cert\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.536380 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.536328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-service-ca\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.536380 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.536371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjrq\" (UniqueName: \"kubernetes.io/projected/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-kube-api-access-fjjrq\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.537061 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.537032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-oauth-serving-cert\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.537261 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.537060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-config\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.537364 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.537100 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-service-ca\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.539562 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.539544 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-serving-cert\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.539618 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.539598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-oauth-config\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.552173 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.552105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjrq\" (UniqueName: \"kubernetes.io/projected/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-kube-api-access-fjjrq\") pod \"console-6d8c4dd7b6-rl5bt\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.609928 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.609887 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:15.735755 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:15.735717 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8c4dd7b6-rl5bt"] Apr 23 17:49:15.738703 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:49:15.738674 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeafba7ae_b7d9_4c18_8102_ad981e91bfc4.slice/crio-64eaf199dd0607457adca7bcabd3296847a0e444deaa2d8c85c4d9ff330ba300 WatchSource:0}: Error finding container 64eaf199dd0607457adca7bcabd3296847a0e444deaa2d8c85c4d9ff330ba300: Status 404 returned error can't find the container with id 64eaf199dd0607457adca7bcabd3296847a0e444deaa2d8c85c4d9ff330ba300 Apr 23 17:49:16.339065 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:16.339022 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8c4dd7b6-rl5bt" event={"ID":"eafba7ae-b7d9-4c18-8102-ad981e91bfc4","Type":"ContainerStarted","Data":"64eaf199dd0607457adca7bcabd3296847a0e444deaa2d8c85c4d9ff330ba300"} Apr 23 17:49:17.320591 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.320552 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-97c4c4cc5-qqxzh"] Apr 23 17:49:17.320992 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.320974 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" Apr 23 17:49:17.321058 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.320996 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" Apr 23 17:49:17.321095 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.321088 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd07caf-40e6-449a-9045-8ed993d63a23" containerName="registry" Apr 23 
17:49:17.324230 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.324207 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.332184 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.332158 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 17:49:17.334003 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.333977 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97c4c4cc5-qqxzh"] Apr 23 17:49:17.453293 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-trusted-ca-bundle\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.453746 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-service-ca\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.453746 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-oauth-serving-cert\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.453746 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453384 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-config\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.453746 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-oauth-config\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.453746 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmg2r\" (UniqueName: \"kubernetes.io/projected/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-kube-api-access-nmg2r\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.453746 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.453539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-serving-cert\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554364 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-serving-cert\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " 
pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-trusted-ca-bundle\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-service-ca\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-oauth-serving-cert\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-config\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-oauth-config\") pod \"console-97c4c4cc5-qqxzh\" 
(UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.554558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.554545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmg2r\" (UniqueName: \"kubernetes.io/projected/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-kube-api-access-nmg2r\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.555295 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.555212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-service-ca\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.555295 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.555228 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-oauth-serving-cert\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.555508 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.555348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-trusted-ca-bundle\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.555508 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.555384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-config\") pod 
\"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.557244 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.557223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-serving-cert\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.557569 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.557551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-oauth-config\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.563981 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.563959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmg2r\" (UniqueName: \"kubernetes.io/projected/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-kube-api-access-nmg2r\") pod \"console-97c4c4cc5-qqxzh\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") " pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:17.636565 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:17.636478 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:18.336216 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:18.336191 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97c4c4cc5-qqxzh"] Apr 23 17:49:18.338868 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:49:18.338838 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b81a41c_8292_4870_b2a0_c1e1ba38ce3f.slice/crio-5f8a24210906df91f3bc7698f25fa1e02ab33d27226a438252de1e9fc1b406fe WatchSource:0}: Error finding container 5f8a24210906df91f3bc7698f25fa1e02ab33d27226a438252de1e9fc1b406fe: Status 404 returned error can't find the container with id 5f8a24210906df91f3bc7698f25fa1e02ab33d27226a438252de1e9fc1b406fe Apr 23 17:49:18.347561 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:18.347453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97c4c4cc5-qqxzh" event={"ID":"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f","Type":"ContainerStarted","Data":"5f8a24210906df91f3bc7698f25fa1e02ab33d27226a438252de1e9fc1b406fe"} Apr 23 17:49:19.354009 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:19.353968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97c4c4cc5-qqxzh" event={"ID":"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f","Type":"ContainerStarted","Data":"15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a"} Apr 23 17:49:19.355195 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:19.355172 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8c4dd7b6-rl5bt" event={"ID":"eafba7ae-b7d9-4c18-8102-ad981e91bfc4","Type":"ContainerStarted","Data":"0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d"} Apr 23 17:49:19.376344 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:19.376294 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-97c4c4cc5-qqxzh" podStartSLOduration=2.376278893 podStartE2EDuration="2.376278893s" podCreationTimestamp="2026-04-23 17:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:49:19.376251636 +0000 UTC m=+491.075852509" watchObservedRunningTime="2026-04-23 17:49:19.376278893 +0000 UTC m=+491.075879766" Apr 23 17:49:19.402544 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:19.402493 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d8c4dd7b6-rl5bt" podStartSLOduration=1.8782384460000001 podStartE2EDuration="4.402479737s" podCreationTimestamp="2026-04-23 17:49:15 +0000 UTC" firstStartedPulling="2026-04-23 17:49:15.740415104 +0000 UTC m=+487.440015956" lastFinishedPulling="2026-04-23 17:49:18.264656381 +0000 UTC m=+489.964257247" observedRunningTime="2026-04-23 17:49:19.40105009 +0000 UTC m=+491.100650962" watchObservedRunningTime="2026-04-23 17:49:19.402479737 +0000 UTC m=+491.102080609" Apr 23 17:49:25.610044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:25.610007 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:25.610044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:25.610054 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:25.614798 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:25.614776 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:26.376929 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:26.376894 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:27.637187 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:27.637154 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:27.637617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:27.637231 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:27.642147 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:27.642125 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:28.381863 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:28.381831 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-97c4c4cc5-qqxzh" Apr 23 17:49:28.435632 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:28.435594 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d8c4dd7b6-rl5bt"] Apr 23 17:49:53.464107 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.464001 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d8c4dd7b6-rl5bt" podUID="eafba7ae-b7d9-4c18-8102-ad981e91bfc4" containerName="console" containerID="cri-o://0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d" gracePeriod=15 Apr 23 17:49:53.709374 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.709353 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d8c4dd7b6-rl5bt_eafba7ae-b7d9-4c18-8102-ad981e91bfc4/console/0.log" Apr 23 17:49:53.709484 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.709414 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:53.744923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.744849 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjrq\" (UniqueName: \"kubernetes.io/projected/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-kube-api-access-fjjrq\") pod \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " Apr 23 17:49:53.744923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.744892 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-service-ca\") pod \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " Apr 23 17:49:53.744923 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.744919 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-oauth-serving-cert\") pod \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " Apr 23 17:49:53.745202 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.745030 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-config\") pod \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " Apr 23 17:49:53.745202 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.745072 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-oauth-config\") pod \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " Apr 23 17:49:53.745202 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.745127 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-serving-cert\") pod \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\" (UID: \"eafba7ae-b7d9-4c18-8102-ad981e91bfc4\") " Apr 23 17:49:53.745349 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.745315 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-service-ca" (OuterVolumeSpecName: "service-ca") pod "eafba7ae-b7d9-4c18-8102-ad981e91bfc4" (UID: "eafba7ae-b7d9-4c18-8102-ad981e91bfc4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:49:53.745419 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.745325 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eafba7ae-b7d9-4c18-8102-ad981e91bfc4" (UID: "eafba7ae-b7d9-4c18-8102-ad981e91bfc4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:49:53.745537 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.745512 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-config" (OuterVolumeSpecName: "console-config") pod "eafba7ae-b7d9-4c18-8102-ad981e91bfc4" (UID: "eafba7ae-b7d9-4c18-8102-ad981e91bfc4"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:49:53.747389 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.747370 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-kube-api-access-fjjrq" (OuterVolumeSpecName: "kube-api-access-fjjrq") pod "eafba7ae-b7d9-4c18-8102-ad981e91bfc4" (UID: "eafba7ae-b7d9-4c18-8102-ad981e91bfc4"). InnerVolumeSpecName "kube-api-access-fjjrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:49:53.747498 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.747461 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eafba7ae-b7d9-4c18-8102-ad981e91bfc4" (UID: "eafba7ae-b7d9-4c18-8102-ad981e91bfc4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:49:53.747543 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.747527 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eafba7ae-b7d9-4c18-8102-ad981e91bfc4" (UID: "eafba7ae-b7d9-4c18-8102-ad981e91bfc4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:49:53.845797 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.845761 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjjrq\" (UniqueName: \"kubernetes.io/projected/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-kube-api-access-fjjrq\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:49:53.845797 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.845792 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-service-ca\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:49:53.845797 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.845801 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-oauth-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:49:53.845797 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.845810 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:49:53.846078 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.845819 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-oauth-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:49:53.846078 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:53.845827 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eafba7ae-b7d9-4c18-8102-ad981e91bfc4-console-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:49:54.448810 ip-10-0-131-107 
kubenswrapper[2579]: I0423 17:49:54.448784 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d8c4dd7b6-rl5bt_eafba7ae-b7d9-4c18-8102-ad981e91bfc4/console/0.log" Apr 23 17:49:54.448996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.448826 2579 generic.go:358] "Generic (PLEG): container finished" podID="eafba7ae-b7d9-4c18-8102-ad981e91bfc4" containerID="0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d" exitCode=2 Apr 23 17:49:54.448996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.448899 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8c4dd7b6-rl5bt" Apr 23 17:49:54.448996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.448898 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8c4dd7b6-rl5bt" event={"ID":"eafba7ae-b7d9-4c18-8102-ad981e91bfc4","Type":"ContainerDied","Data":"0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d"} Apr 23 17:49:54.448996 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.448986 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8c4dd7b6-rl5bt" event={"ID":"eafba7ae-b7d9-4c18-8102-ad981e91bfc4","Type":"ContainerDied","Data":"64eaf199dd0607457adca7bcabd3296847a0e444deaa2d8c85c4d9ff330ba300"} Apr 23 17:49:54.449132 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.449008 2579 scope.go:117] "RemoveContainer" containerID="0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d" Apr 23 17:49:54.457172 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.457146 2579 scope.go:117] "RemoveContainer" containerID="0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d" Apr 23 17:49:54.457447 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:49:54.457428 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d\": container with ID starting with 0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d not found: ID does not exist" containerID="0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d" Apr 23 17:49:54.457509 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.457458 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d"} err="failed to get container status \"0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d\": rpc error: code = NotFound desc = could not find container \"0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d\": container with ID starting with 0eb96934249f816d335eb9d70f1dccd30ff5ad73ba68ad3f07322db0c171335d not found: ID does not exist" Apr 23 17:49:54.468835 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.468807 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d8c4dd7b6-rl5bt"] Apr 23 17:49:54.474068 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.474044 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d8c4dd7b6-rl5bt"] Apr 23 17:49:54.906454 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:49:54.906420 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafba7ae-b7d9-4c18-8102-ad981e91bfc4" path="/var/lib/kubelet/pods/eafba7ae-b7d9-4c18-8102-ad981e91bfc4/volumes" Apr 23 17:51:08.803296 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:08.803265 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 17:51:08.804227 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:08.804205 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 17:51:08.806374 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:08.806352 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 17:51:08.806989 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:08.806975 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 17:51:49.122733 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.122697 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:51:49.123298 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.123142 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eafba7ae-b7d9-4c18-8102-ad981e91bfc4" containerName="console" Apr 23 17:51:49.123298 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.123163 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafba7ae-b7d9-4c18-8102-ad981e91bfc4" containerName="console" Apr 23 17:51:49.123298 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.123249 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="eafba7ae-b7d9-4c18-8102-ad981e91bfc4" containerName="console" Apr 23 17:51:49.126558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.126537 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.135855 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.135837 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 17:51:49.135977 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.135918 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 17:51:49.136035 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136019 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 17:51:49.136078 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136046 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 17:51:49.136126 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136107 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 17:51:49.136261 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136248 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 17:51:49.136313 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136302 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 17:51:49.136737 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-tx7cw\"" Apr 23 17:51:49.136780 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.136758 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 17:51:49.140183 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.140167 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 17:51:49.180347 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.180316 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:51:49.228999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.228964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.228999 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.228998 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3808bbb3-777a-4aa1-b625-18d13b2fef7e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nll4m\" (UniqueName: \"kubernetes.io/projected/3808bbb3-777a-4aa1-b625-18d13b2fef7e-kube-api-access-nll4m\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3808bbb3-777a-4aa1-b625-18d13b2fef7e-config-out\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229141 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229216 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-web-config\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229436 
ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3808bbb3-777a-4aa1-b625-18d13b2fef7e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229436 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3808bbb3-777a-4aa1-b625-18d13b2fef7e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229436 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229436 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-config-volume\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.229436 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.229392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/3808bbb3-777a-4aa1-b625-18d13b2fef7e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.330794 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.330750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.330794 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.330791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3808bbb3-777a-4aa1-b625-18d13b2fef7e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331047 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.330861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331047 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.330901 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nll4m\" (UniqueName: \"kubernetes.io/projected/3808bbb3-777a-4aa1-b625-18d13b2fef7e-kube-api-access-nll4m\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331047 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.330922 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3808bbb3-777a-4aa1-b625-18d13b2fef7e-config-out\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331047 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.330973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331047 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-web-config\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3808bbb3-777a-4aa1-b625-18d13b2fef7e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331286 ip-10-0-131-107 
kubenswrapper[2579]: I0423 17:51:49.331128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3808bbb3-777a-4aa1-b625-18d13b2fef7e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-config-volume\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331286 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3808bbb3-777a-4aa1-b625-18d13b2fef7e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331581 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3808bbb3-777a-4aa1-b625-18d13b2fef7e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.331581 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.331550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3808bbb3-777a-4aa1-b625-18d13b2fef7e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.333074 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.333048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3808bbb3-777a-4aa1-b625-18d13b2fef7e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.334293 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.334250 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.334555 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.334512 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.334656 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.334638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/3808bbb3-777a-4aa1-b625-18d13b2fef7e-config-out\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.334804 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.334781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.334872 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.334816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3808bbb3-777a-4aa1-b625-18d13b2fef7e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.334959 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.334914 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.335142 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.335126 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.335724 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.335697 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-web-config\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.335894 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.335879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3808bbb3-777a-4aa1-b625-18d13b2fef7e-config-volume\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.339629 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.339607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nll4m\" (UniqueName: \"kubernetes.io/projected/3808bbb3-777a-4aa1-b625-18d13b2fef7e-kube-api-access-nll4m\") pod \"alertmanager-main-0\" (UID: \"3808bbb3-777a-4aa1-b625-18d13b2fef7e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.435315 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.435217 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:51:49.567136 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.567112 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:51:49.569660 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:51:49.569631 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3808bbb3_777a_4aa1_b625_18d13b2fef7e.slice/crio-ee463e33750596d41c58665723c33bb0409e2f49af66b1048c904a3f9d44130f WatchSource:0}: Error finding container ee463e33750596d41c58665723c33bb0409e2f49af66b1048c904a3f9d44130f: Status 404 returned error can't find the container with id ee463e33750596d41c58665723c33bb0409e2f49af66b1048c904a3f9d44130f Apr 23 17:51:49.571438 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.571424 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:51:49.768575 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:49.768536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"ee463e33750596d41c58665723c33bb0409e2f49af66b1048c904a3f9d44130f"} Apr 23 17:51:50.118726 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.118651 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6885598b6-lfxsh"] Apr 23 17:51:50.122255 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.122237 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.125121 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125099 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 17:51:50.125441 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125122 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 17:51:50.125441 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125425 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 17:51:50.125595 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125448 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 17:51:50.125595 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125460 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-a5r96kbnji4t3\"" Apr 23 17:51:50.125595 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125454 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 17:51:50.125595 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.125478 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mfhxp\"" Apr 23 17:51:50.137760 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.137738 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6885598b6-lfxsh"] Apr 23 17:51:50.239895 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.239857 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czkqh\" (UniqueName: \"kubernetes.io/projected/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-kube-api-access-czkqh\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.239895 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.239895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.240155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.239954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-tls\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.240155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.240017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.240155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.240056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.240155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.240078 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-metrics-client-ca\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.240327 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.240158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-grpc-tls\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.240327 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.240190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.341532 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.341482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.341712 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.341538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-metrics-client-ca\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.341712 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.341583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-grpc-tls\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.341712 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.341609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" Apr 23 17:51:50.342315 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.341997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czkqh\" (UniqueName: \"kubernetes.io/projected/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-kube-api-access-czkqh\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " 
pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.342315 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.342049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.342315 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.342114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-tls\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.342315 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.342157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.343723 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.343247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-metrics-client-ca\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.345148 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.345120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.345289 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.345238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-grpc-tls\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.345693 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.345670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.346077 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.346054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.346196 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.346176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.346851 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.346808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-secret-thanos-querier-tls\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.351176 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.351155 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czkqh\" (UniqueName: \"kubernetes.io/projected/3e23db1e-dda5-495a-9cdb-d49b902d0e8f-kube-api-access-czkqh\") pod \"thanos-querier-6885598b6-lfxsh\" (UID: \"3e23db1e-dda5-495a-9cdb-d49b902d0e8f\") " pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.432267 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.432239 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:50.564204 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.564176 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6885598b6-lfxsh"]
Apr 23 17:51:50.570877 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:51:50.570848 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e23db1e_dda5_495a_9cdb_d49b902d0e8f.slice/crio-3252cef40d503f6c05355b9cb6c8dd687d03f91bab467c1451df2740d9b64e47 WatchSource:0}: Error finding container 3252cef40d503f6c05355b9cb6c8dd687d03f91bab467c1451df2740d9b64e47: Status 404 returned error can't find the container with id 3252cef40d503f6c05355b9cb6c8dd687d03f91bab467c1451df2740d9b64e47
Apr 23 17:51:50.772145 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.772104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"3252cef40d503f6c05355b9cb6c8dd687d03f91bab467c1451df2740d9b64e47"}
Apr 23 17:51:50.773425 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.773395 2579 generic.go:358] "Generic (PLEG): container finished" podID="3808bbb3-777a-4aa1-b625-18d13b2fef7e" containerID="f6f56a7decef467a81b788e1ade4ddb2e9454fd6d1ed5b40e75f5e7450518e4d" exitCode=0
Apr 23 17:51:50.773550 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:50.773470 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerDied","Data":"f6f56a7decef467a81b788e1ade4ddb2e9454fd6d1ed5b40e75f5e7450518e4d"}
Apr 23 17:51:52.782089 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.782052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"71f3821598740d08c4d1ba1006e285e2d9191bec803d5611add8c67ab284ca2a"}
Apr 23 17:51:52.782459 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.782101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"f7648bec8ba5d599a3d00f755b07021e79b1f5b3f8f571d1529bfcc09ea9aeca"}
Apr 23 17:51:52.782459 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.782118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"40ae0e4addf245e8881e5ba1aae4d27a42cd9fd2514db48e39a9f941bb815249"}
Apr 23 17:51:52.784693 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.784633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"194bc531256f72553634fadb314c66302982715a1b1b8a12247ac7fad1111f4b"}
Apr 23 17:51:52.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.784692 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"0e238d6c0c3e7f27cee485ff8ff48e2b81a480321436997cdfe87edcc1165b27"}
Apr 23 17:51:52.784823 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.784712 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"57a605c27b565539900f6dbd82c7987250c8cf8dc0b0297d5abaf301217481f7"}
Apr 23 17:51:52.954284 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.954244 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"]
Apr 23 17:51:52.958124 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.958105 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:52.960334 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.960311 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 17:51:52.961464 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.961443 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 17:51:52.961601 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.961583 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-45x59\""
Apr 23 17:51:52.962236 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.962219 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 17:51:52.962535 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.962518 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 17:51:52.964567 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.964540 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 17:51:52.968755 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.968736 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 17:51:52.978558 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:52.978533 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"]
Apr 23 17:51:53.074322 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074322 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-serving-certs-ca-bundle\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074322 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-secret-telemeter-client\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074593 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-federate-client-tls\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074593 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074458 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-metrics-client-ca\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074593 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074521 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-telemeter-client-tls\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074593 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074553 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.074593 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.074586 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8kn\" (UniqueName: \"kubernetes.io/projected/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-kube-api-access-vg8kn\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175163 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-secret-telemeter-client\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-federate-client-tls\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175250 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-metrics-client-ca\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175297 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-telemeter-client-tls\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175502 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8kn\" (UniqueName: \"kubernetes.io/projected/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-kube-api-access-vg8kn\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175502 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.175502 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.175394 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-serving-certs-ca-bundle\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.176201 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.176136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-serving-certs-ca-bundle\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.176420 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.176365 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.177099 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.177058 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-metrics-client-ca\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.178197 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.178169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-secret-telemeter-client\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.178509 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.178490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-federate-client-tls\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.178995 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.178972 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.179085 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.179005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-telemeter-client-tls\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.184123 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.184102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8kn\" (UniqueName: \"kubernetes.io/projected/bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1-kube-api-access-vg8kn\") pod \"telemeter-client-6d98bc84f8-p6fv8\" (UID: \"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1\") " pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.268563 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.268522 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"
Apr 23 17:51:53.423221 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.423157 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8"]
Apr 23 17:51:53.425642 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:51:53.425515 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1a8e7a_74a5_4fce_ad08_c6a765d4adc1.slice/crio-dba3265e65d781000aea4d99693966b07bdda3e493588fda3a00a49155c8e13e WatchSource:0}: Error finding container dba3265e65d781000aea4d99693966b07bdda3e493588fda3a00a49155c8e13e: Status 404 returned error can't find the container with id dba3265e65d781000aea4d99693966b07bdda3e493588fda3a00a49155c8e13e
Apr 23 17:51:53.788282 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.788241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8" event={"ID":"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1","Type":"ContainerStarted","Data":"dba3265e65d781000aea4d99693966b07bdda3e493588fda3a00a49155c8e13e"}
Apr 23 17:51:53.790547 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.790508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"4c1aeb2e9f7e8339c1be37fc4f890c00cd40942e0fb5e53a34d7581605148374"}
Apr 23 17:51:53.790547 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.790542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"d4419ddb1cd2e68fb2ba560cd8aba9eb0e943aaafccceef2e18199323b831c0b"}
Apr 23 17:51:53.790547 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.790552 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" event={"ID":"3e23db1e-dda5-495a-9cdb-d49b902d0e8f","Type":"ContainerStarted","Data":"813d37c79a387140199e128b2a5d871da1eedb3e73c452b17959c575e73c2b5b"}
Apr 23 17:51:53.790843 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.790783 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:51:53.792875 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.792844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"286ac2987d013ea87f6614ef5b3c7ae1732cd7d96fc263a01182d20b350374c9"}
Apr 23 17:51:53.792875 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.792873 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"6f84db75c42b4b9b0136c4b37d24bf5bddd40a45fb19c4a736944f3b9131322e"}
Apr 23 17:51:53.793042 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.792886 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3808bbb3-777a-4aa1-b625-18d13b2fef7e","Type":"ContainerStarted","Data":"5556ad07b1dcb6e3c83b278d4bed7b671403fb58376371f7992479e968c2897f"}
Apr 23 17:51:53.820539 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.820494 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh" podStartSLOduration=1.029859409 podStartE2EDuration="3.820481142s" podCreationTimestamp="2026-04-23 17:51:50 +0000 UTC" firstStartedPulling="2026-04-23 17:51:50.572635727 +0000 UTC m=+642.272236580" lastFinishedPulling="2026-04-23 17:51:53.363257448 +0000 UTC m=+645.062858313" observedRunningTime="2026-04-23 17:51:53.81819431 +0000 UTC m=+645.517795182" watchObservedRunningTime="2026-04-23 17:51:53.820481142 +0000 UTC m=+645.520082014"
Apr 23 17:51:53.861865 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:53.861802 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.069298709 podStartE2EDuration="4.861782434s" podCreationTimestamp="2026-04-23 17:51:49 +0000 UTC" firstStartedPulling="2026-04-23 17:51:49.5715581 +0000 UTC m=+641.271158952" lastFinishedPulling="2026-04-23 17:51:53.364041813 +0000 UTC m=+645.063642677" observedRunningTime="2026-04-23 17:51:53.858492019 +0000 UTC m=+645.558092902" watchObservedRunningTime="2026-04-23 17:51:53.861782434 +0000 UTC m=+645.561383308"
Apr 23 17:51:54.322596 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.322565 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:51:54.327279 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.327251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.329844 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.329817 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 17:51:54.330028 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.329853 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vckj6\""
Apr 23 17:51:54.330028 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.329889 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 17:51:54.330028 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.329902 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 17:51:54.330244 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330230 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 17:51:54.330297 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330266 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 17:51:54.330570 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330552 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 17:51:54.330692 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330580 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 17:51:54.330692 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330598 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 17:51:54.330692 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330604 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 17:51:54.330950 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330920 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 17:51:54.331055 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.330982 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9a1v5evqrm0dr\""
Apr 23 17:51:54.338770 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.338743 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 17:51:54.344300 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.343090 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 17:51:54.344300 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.343179 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:51:54.488901 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.488856 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.488901 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.488899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-web-config\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.488929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.488979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gh5\" (UniqueName: \"kubernetes.io/projected/464064e6-cf95-41e9-a8bb-2f29be481bc8-kube-api-access-27gh5\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/464064e6-cf95-41e9-a8bb-2f29be481bc8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489131 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489081 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489345 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489345 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489345 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-config\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489345 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/464064e6-cf95-41e9-a8bb-2f29be481bc8-config-out\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489345 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489468 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489499 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.489737 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.489547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590272 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590272 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590272 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590297 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-web-config\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-thanos-prometheus-http-client-file\") pod
\"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27gh5\" (UniqueName: \"kubernetes.io/projected/464064e6-cf95-41e9-a8bb-2f29be481bc8-kube-api-access-27gh5\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/464064e6-cf95-41e9-a8bb-2f29be481bc8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.590545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591444 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-config\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591444 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/464064e6-cf95-41e9-a8bb-2f29be481bc8-config-out\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591444 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591444 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591444 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.590659 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591444 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.591415 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591744 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.591686 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591744 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.591709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.591847 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.591766 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.592870 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.592529 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.594155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.593832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.594155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.593845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-web-config\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.594155 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.593956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.594721 ip-10-0-131-107 
kubenswrapper[2579]: I0423 17:51:54.594430 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.594825 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.594785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.594889 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.594852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/464064e6-cf95-41e9-a8bb-2f29be481bc8-config-out\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.596671 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.596648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/464064e6-cf95-41e9-a8bb-2f29be481bc8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.596855 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.596783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
17:51:54.597311 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.597269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.597566 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.597543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-config\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.597767 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.597744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/464064e6-cf95-41e9-a8bb-2f29be481bc8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.598352 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.598330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/464064e6-cf95-41e9-a8bb-2f29be481bc8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.601246 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.601225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gh5\" (UniqueName: \"kubernetes.io/projected/464064e6-cf95-41e9-a8bb-2f29be481bc8-kube-api-access-27gh5\") pod \"prometheus-k8s-0\" (UID: \"464064e6-cf95-41e9-a8bb-2f29be481bc8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.647333 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.647288 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:51:54.951134 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:54.951106 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:51:54.953141 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:51:54.953113 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod464064e6_cf95_41e9_a8bb_2f29be481bc8.slice/crio-ebf4da79fd965695c867379ab5a46c023045beee3dbcaf7582f0b7c986854ca4 WatchSource:0}: Error finding container ebf4da79fd965695c867379ab5a46c023045beee3dbcaf7582f0b7c986854ca4: Status 404 returned error can't find the container with id ebf4da79fd965695c867379ab5a46c023045beee3dbcaf7582f0b7c986854ca4 Apr 23 17:51:55.802301 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.802268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8" event={"ID":"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1","Type":"ContainerStarted","Data":"8babb2422e890ed1325f248ba1761c41ea7d9cdca950215b018465924022e45d"} Apr 23 17:51:55.802301 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.802302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8" event={"ID":"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1","Type":"ContainerStarted","Data":"4c12abced309d17fe0fd385179090ab6688452c4eb4b90db6c080a3bc8d252ea"} Apr 23 17:51:55.802301 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.802312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8" 
event={"ID":"bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1","Type":"ContainerStarted","Data":"f6126cb1f3ba61a3ce2ae69d9672d30f15756716ca5757c8d532258e1539c503"} Apr 23 17:51:55.803623 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.803598 2579 generic.go:358] "Generic (PLEG): container finished" podID="464064e6-cf95-41e9-a8bb-2f29be481bc8" containerID="f20b56cab7985a2895bc5f8b20f0988f7769adaf7ac526adac2f3fe1c20c17a3" exitCode=0 Apr 23 17:51:55.803725 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.803648 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerDied","Data":"f20b56cab7985a2895bc5f8b20f0988f7769adaf7ac526adac2f3fe1c20c17a3"} Apr 23 17:51:55.803725 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.803676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"ebf4da79fd965695c867379ab5a46c023045beee3dbcaf7582f0b7c986854ca4"} Apr 23 17:51:55.827050 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:55.827004 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6d98bc84f8-p6fv8" podStartSLOduration=2.3898424240000002 podStartE2EDuration="3.826991664s" podCreationTimestamp="2026-04-23 17:51:52 +0000 UTC" firstStartedPulling="2026-04-23 17:51:53.427848267 +0000 UTC m=+645.127449128" lastFinishedPulling="2026-04-23 17:51:54.864997518 +0000 UTC m=+646.564598368" observedRunningTime="2026-04-23 17:51:55.825285254 +0000 UTC m=+647.524886125" watchObservedRunningTime="2026-04-23 17:51:55.826991664 +0000 UTC m=+647.526592536" Apr 23 17:51:56.524626 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.524590 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c49bcb68b-hxmdm"] Apr 23 17:51:56.529003 ip-10-0-131-107 kubenswrapper[2579]: 
I0423 17:51:56.528902 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.545280 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.545255 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c49bcb68b-hxmdm"] Apr 23 17:51:56.611352 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-service-ca\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.611545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hz84\" (UniqueName: \"kubernetes.io/projected/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-kube-api-access-9hz84\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.611545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-oauth-config\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.611545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-config\") pod \"console-6c49bcb68b-hxmdm\" (UID: 
\"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.611545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611479 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-trusted-ca-bundle\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.611545 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611519 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-oauth-serving-cert\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.611853 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.611581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-serving-cert\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712317 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-trusted-ca-bundle\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712491 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-oauth-serving-cert\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712491 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-serving-cert\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712491 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-service-ca\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712491 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hz84\" (UniqueName: \"kubernetes.io/projected/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-kube-api-access-9hz84\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712676 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-oauth-config\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.712676 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.712515 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-config\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.713347 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.713202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-oauth-serving-cert\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.713347 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.713291 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-trusted-ca-bundle\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.713513 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.713358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-config\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.713699 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.713674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-service-ca\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.715540 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.715516 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-oauth-config\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.715649 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.715572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-serving-cert\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.721642 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.721619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hz84\" (UniqueName: \"kubernetes.io/projected/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-kube-api-access-9hz84\") pod \"console-6c49bcb68b-hxmdm\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") " pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:51:56.841358 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.841269 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c49bcb68b-hxmdm"
Apr 23 17:51:56.991743 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:56.991717 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c49bcb68b-hxmdm"]
Apr 23 17:51:56.994450 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:51:56.994407 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed95a4c_8b4c_4089_b7d7_077d24ebc134.slice/crio-65660a4d37c1f38888bcff0e19db5d88dc36405c0a4830110e1fc99a8d4001a8 WatchSource:0}: Error finding container 65660a4d37c1f38888bcff0e19db5d88dc36405c0a4830110e1fc99a8d4001a8: Status 404 returned error can't find the container with id 65660a4d37c1f38888bcff0e19db5d88dc36405c0a4830110e1fc99a8d4001a8
Apr 23 17:51:57.812642 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:57.812592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c49bcb68b-hxmdm" event={"ID":"5ed95a4c-8b4c-4089-b7d7-077d24ebc134","Type":"ContainerStarted","Data":"dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f"}
Apr 23 17:51:57.812642 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:57.812640 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c49bcb68b-hxmdm" event={"ID":"5ed95a4c-8b4c-4089-b7d7-077d24ebc134","Type":"ContainerStarted","Data":"65660a4d37c1f38888bcff0e19db5d88dc36405c0a4830110e1fc99a8d4001a8"}
Apr 23 17:51:57.834738 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:57.834671 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c49bcb68b-hxmdm" podStartSLOduration=1.8346513290000002 podStartE2EDuration="1.834651329s" podCreationTimestamp="2026-04-23 17:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:51:57.832382396 +0000 UTC m=+649.531983267" watchObservedRunningTime="2026-04-23 17:51:57.834651329 +0000 UTC m=+649.534252202"
Apr 23 17:51:58.818617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.818529 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"bccc0414129b31947abca93aebaccf9abcff5d3b3ce92e3dcaa7d66db080a8f1"}
Apr 23 17:51:58.818617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.818566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"87d9aab233d59c08ec7b171aab0a451795124f1c4a11c480aedfc2db8842663d"}
Apr 23 17:51:58.818617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.818576 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"ace03819f3fd94d2ed8e10b5997f59bdfedffa19af723f9d98c314d211696ce6"}
Apr 23 17:51:58.818617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.818584 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"55c3876ca772772a82016b382e5418fe0f3446c456e524d0689d2952427b42fa"}
Apr 23 17:51:58.818617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.818594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"8a89e1d9059b4166519159721d87e9365b874fe9ec8f8ac2a153097502a36138"}
Apr 23 17:51:58.818617 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.818603 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"464064e6-cf95-41e9-a8bb-2f29be481bc8","Type":"ContainerStarted","Data":"166579c88bafbbaab61550e643b63d996a4ee7869d963b0dee404622058f1153"}
Apr 23 17:51:58.851224 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:58.851169 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.37135249 podStartE2EDuration="4.851148324s" podCreationTimestamp="2026-04-23 17:51:54 +0000 UTC" firstStartedPulling="2026-04-23 17:51:55.804733113 +0000 UTC m=+647.504333964" lastFinishedPulling="2026-04-23 17:51:58.284528949 +0000 UTC m=+649.984129798" observedRunningTime="2026-04-23 17:51:58.848895859 +0000 UTC m=+650.548496733" watchObservedRunningTime="2026-04-23 17:51:58.851148324 +0000 UTC m=+650.550749196"
Apr 23 17:51:59.647983 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:59.647926 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:51:59.802806 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:51:59.802778 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6885598b6-lfxsh"
Apr 23 17:52:06.841512 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:06.841467 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c49bcb68b-hxmdm"
Apr 23 17:52:06.841983 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:06.841524 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c49bcb68b-hxmdm"
Apr 23 17:52:06.846646 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:06.846621 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c49bcb68b-hxmdm"
Apr 23 17:52:07.849965 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:07.849919 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c49bcb68b-hxmdm"
Apr 23 17:52:07.907782 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:07.907737 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97c4c4cc5-qqxzh"]
Apr 23 17:52:32.933504 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:32.933456 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-97c4c4cc5-qqxzh" podUID="5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" containerName="console" containerID="cri-o://15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a" gracePeriod=15
Apr 23 17:52:33.167333 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.167310 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97c4c4cc5-qqxzh_5b81a41c-8292-4870-b2a0-c1e1ba38ce3f/console/0.log"
Apr 23 17:52:33.167467 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.167380 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97c4c4cc5-qqxzh"
Apr 23 17:52:33.243644 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243564 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-serving-cert\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.243644 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243636 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-oauth-config\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.243818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243677 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-trusted-ca-bundle\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.243818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243733 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-oauth-serving-cert\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.243818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243757 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmg2r\" (UniqueName: \"kubernetes.io/projected/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-kube-api-access-nmg2r\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.243818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243790 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-service-ca\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.243818 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.243814 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-config\") pod \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\" (UID: \"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f\") "
Apr 23 17:52:33.244261 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.244216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:52:33.244261 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.244238 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-service-ca" (OuterVolumeSpecName: "service-ca") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:52:33.244679 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.244279 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:52:33.244679 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.244311 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-config" (OuterVolumeSpecName: "console-config") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:52:33.246086 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.246061 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:52:33.246182 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.246154 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:52:33.246182 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.246158 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-kube-api-access-nmg2r" (OuterVolumeSpecName: "kube-api-access-nmg2r") pod "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" (UID: "5b81a41c-8292-4870-b2a0-c1e1ba38ce3f"). InnerVolumeSpecName "kube-api-access-nmg2r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:52:33.344791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344751 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.344791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344783 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-oauth-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.344791 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344792 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-trusted-ca-bundle\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.345044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344804 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-oauth-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.345044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344815 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmg2r\" (UniqueName: \"kubernetes.io/projected/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-kube-api-access-nmg2r\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.345044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344824 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-service-ca\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.345044 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.344833 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f-console-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:52:33.926857 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.926831 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97c4c4cc5-qqxzh_5b81a41c-8292-4870-b2a0-c1e1ba38ce3f/console/0.log"
Apr 23 17:52:33.927062 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.926871 2579 generic.go:358] "Generic (PLEG): container finished" podID="5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" containerID="15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a" exitCode=2
Apr 23 17:52:33.927062 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.926907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97c4c4cc5-qqxzh" event={"ID":"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f","Type":"ContainerDied","Data":"15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a"}
Apr 23 17:52:33.927062 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.926927 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97c4c4cc5-qqxzh" event={"ID":"5b81a41c-8292-4870-b2a0-c1e1ba38ce3f","Type":"ContainerDied","Data":"5f8a24210906df91f3bc7698f25fa1e02ab33d27226a438252de1e9fc1b406fe"}
Apr 23 17:52:33.927062 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.926967 2579 scope.go:117] "RemoveContainer" containerID="15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a"
Apr 23 17:52:33.927062 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.926969 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97c4c4cc5-qqxzh"
Apr 23 17:52:33.936113 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.935825 2579 scope.go:117] "RemoveContainer" containerID="15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a"
Apr 23 17:52:33.936354 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:52:33.936188 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a\": container with ID starting with 15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a not found: ID does not exist" containerID="15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a"
Apr 23 17:52:33.936354 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.936211 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a"} err="failed to get container status \"15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a\": rpc error: code = NotFound desc = could not find container \"15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a\": container with ID starting with 15b58b094b8a6a1dd8642fdf5e7c7a8937a863c5faf53dd734b9a5443fca919a not found: ID does not exist"
Apr 23 17:52:33.949312 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.949283 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97c4c4cc5-qqxzh"]
Apr 23 17:52:33.953526 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:33.953502 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-97c4c4cc5-qqxzh"]
Apr 23 17:52:34.911860 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:34.911825 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" path="/var/lib/kubelet/pods/5b81a41c-8292-4870-b2a0-c1e1ba38ce3f/volumes"
Apr 23 17:52:54.647908 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:54.647820 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:52:54.668318 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:54.668290 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:52:55.005463 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:52:55.005431 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:53:11.901284 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.901202 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fcd5b6589-76pgw"]
Apr 23 17:53:11.901841 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.901817 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" containerName="console"
Apr 23 17:53:11.901841 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.901844 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" containerName="console"
Apr 23 17:53:11.902031 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.901920 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b81a41c-8292-4870-b2a0-c1e1ba38ce3f" containerName="console"
Apr 23 17:53:11.905085 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.905058 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.915786 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.915756 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fcd5b6589-76pgw"]
Apr 23 17:53:11.997980 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.997909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-trusted-ca-bundle\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.998162 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.998121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-oauth-serving-cert\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.998162 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.998154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n2kz\" (UniqueName: \"kubernetes.io/projected/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-kube-api-access-4n2kz\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.998277 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.998232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-oauth-config\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.998448 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.998430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-serving-cert\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.998483 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.998471 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-config\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:11.998513 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:11.998490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-service-ca\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100096 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-oauth-config\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100265 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-serving-cert\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100265 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-config\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100265 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-service-ca\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100265 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-trusted-ca-bundle\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100512 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-oauth-serving-cert\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.100512 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.100481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2kz\" (UniqueName: \"kubernetes.io/projected/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-kube-api-access-4n2kz\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.101142 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.101118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-trusted-ca-bundle\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.101237 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.101140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-service-ca\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.101237 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.101118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-oauth-serving-cert\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.101237 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.101212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-config\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.102696 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.102670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-oauth-config\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.102813 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.102795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-serving-cert\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.109287 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.109264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2kz\" (UniqueName: \"kubernetes.io/projected/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-kube-api-access-4n2kz\") pod \"console-5fcd5b6589-76pgw\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.216918 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.216820 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:12.353603 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:12.353543 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fcd5b6589-76pgw"]
Apr 23 17:53:12.356771 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:53:12.356745 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0069f6e0_8acc_4ff5_b509_78ba3d1cb0ac.slice/crio-b7bd90f64e4f8d5290fdab3e9c1c05a1f7dd9ea9721e60278131f39169c58a21 WatchSource:0}: Error finding container b7bd90f64e4f8d5290fdab3e9c1c05a1f7dd9ea9721e60278131f39169c58a21: Status 404 returned error can't find the container with id b7bd90f64e4f8d5290fdab3e9c1c05a1f7dd9ea9721e60278131f39169c58a21
Apr 23 17:53:13.046080 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:13.046037 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fcd5b6589-76pgw" event={"ID":"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac","Type":"ContainerStarted","Data":"7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf"}
Apr 23 17:53:13.046080 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:13.046086 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fcd5b6589-76pgw" event={"ID":"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac","Type":"ContainerStarted","Data":"b7bd90f64e4f8d5290fdab3e9c1c05a1f7dd9ea9721e60278131f39169c58a21"}
Apr 23 17:53:13.064690 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:13.064641 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fcd5b6589-76pgw" podStartSLOduration=2.064625349 podStartE2EDuration="2.064625349s" podCreationTimestamp="2026-04-23 17:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:53:13.06353226 +0000 UTC m=+724.763133132" watchObservedRunningTime="2026-04-23 17:53:13.064625349 +0000 UTC m=+724.764226220"
Apr 23 17:53:22.217169 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:22.217129 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:22.217169 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:22.217180 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:22.221854 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:22.221828 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:23.081653 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:23.081626 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fcd5b6589-76pgw"
Apr 23 17:53:23.130681 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:23.130645 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c49bcb68b-hxmdm"]
Apr 23 17:53:48.152362 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.152307 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c49bcb68b-hxmdm" podUID="5ed95a4c-8b4c-4089-b7d7-077d24ebc134" containerName="console" containerID="cri-o://dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f" gracePeriod=15
Apr 23 17:53:48.398289 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.398263 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c49bcb68b-hxmdm_5ed95a4c-8b4c-4089-b7d7-077d24ebc134/console/0.log"
Apr 23 17:53:48.398420 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.398326 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c49bcb68b-hxmdm"
Apr 23 17:53:48.431521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431449 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-trusted-ca-bundle\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.431521 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431500 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-config\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.431726 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431528 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-serving-cert\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.431726 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431656 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-service-ca\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.431726 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431695 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-oauth-serving-cert\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.431877 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431761 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hz84\" (UniqueName: \"kubernetes.io/projected/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-kube-api-access-9hz84\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.431877 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431788 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-oauth-config\") pod \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\" (UID: \"5ed95a4c-8b4c-4089-b7d7-077d24ebc134\") "
Apr 23 17:53:48.432007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431908 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:53:48.432007 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.431998 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-config" (OuterVolumeSpecName: "console-config") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:53:48.432273 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.432213 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-service-ca" (OuterVolumeSpecName: "service-ca") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:53:48.432273 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.432222 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-trusted-ca-bundle\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:53:48.432273 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.432224 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:53:48.432273 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.432249 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:53:48.433836 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.433813 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:53:48.434601 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.434584 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:53:48.434687 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.434599 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-kube-api-access-9hz84" (OuterVolumeSpecName: "kube-api-access-9hz84") pod "5ed95a4c-8b4c-4089-b7d7-077d24ebc134" (UID: "5ed95a4c-8b4c-4089-b7d7-077d24ebc134"). InnerVolumeSpecName "kube-api-access-9hz84". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:53:48.532909 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.532880 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9hz84\" (UniqueName: \"kubernetes.io/projected/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-kube-api-access-9hz84\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:53:48.532909 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.532908 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-oauth-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\""
Apr 23 17:53:48.532909 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.532918 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-console-serving-cert\") on node
\"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:53:48.533149 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.532928 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-service-ca\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:53:48.533149 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:48.532950 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ed95a4c-8b4c-4089-b7d7-077d24ebc134-oauth-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 17:53:49.155614 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.155586 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c49bcb68b-hxmdm_5ed95a4c-8b4c-4089-b7d7-077d24ebc134/console/0.log" Apr 23 17:53:49.156004 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.155628 2579 generic.go:358] "Generic (PLEG): container finished" podID="5ed95a4c-8b4c-4089-b7d7-077d24ebc134" containerID="dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f" exitCode=2 Apr 23 17:53:49.156004 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.155713 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c49bcb68b-hxmdm" Apr 23 17:53:49.156004 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.155714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c49bcb68b-hxmdm" event={"ID":"5ed95a4c-8b4c-4089-b7d7-077d24ebc134","Type":"ContainerDied","Data":"dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f"} Apr 23 17:53:49.156004 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.155751 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c49bcb68b-hxmdm" event={"ID":"5ed95a4c-8b4c-4089-b7d7-077d24ebc134","Type":"ContainerDied","Data":"65660a4d37c1f38888bcff0e19db5d88dc36405c0a4830110e1fc99a8d4001a8"} Apr 23 17:53:49.156004 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.155766 2579 scope.go:117] "RemoveContainer" containerID="dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f" Apr 23 17:53:49.163893 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.163860 2579 scope.go:117] "RemoveContainer" containerID="dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f" Apr 23 17:53:49.164146 ip-10-0-131-107 kubenswrapper[2579]: E0423 17:53:49.164125 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f\": container with ID starting with dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f not found: ID does not exist" containerID="dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f" Apr 23 17:53:49.164214 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.164154 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f"} err="failed to get container status \"dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f\": rpc error: code = 
NotFound desc = could not find container \"dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f\": container with ID starting with dba1647a6c471c03fe432c06dc5eacb91d46e272e446757b08979ddbc998ed0f not found: ID does not exist" Apr 23 17:53:49.173917 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.173895 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c49bcb68b-hxmdm"] Apr 23 17:53:49.178040 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:49.178020 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c49bcb68b-hxmdm"] Apr 23 17:53:50.906603 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:53:50.906558 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed95a4c-8b4c-4089-b7d7-077d24ebc134" path="/var/lib/kubelet/pods/5ed95a4c-8b4c-4089-b7d7-077d24ebc134/volumes" Apr 23 17:54:06.547519 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.547482 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-26hlr"] Apr 23 17:54:06.547883 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.547846 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ed95a4c-8b4c-4089-b7d7-077d24ebc134" containerName="console" Apr 23 17:54:06.547883 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.547859 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed95a4c-8b4c-4089-b7d7-077d24ebc134" containerName="console" Apr 23 17:54:06.547987 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.547927 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ed95a4c-8b4c-4089-b7d7-077d24ebc134" containerName="console" Apr 23 17:54:06.551157 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.551140 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.553784 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.553760 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 17:54:06.559893 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.559869 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-26hlr"] Apr 23 17:54:06.699260 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.699223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-original-pull-secret\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.699428 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.699294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-dbus\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.699428 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.699322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-kubelet-config\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.800417 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.800320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-original-pull-secret\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.800417 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.800374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-dbus\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.800417 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.800395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-kubelet-config\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.800651 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.800475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-kubelet-config\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.800651 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.800607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-dbus\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.802809 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.802789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a457eef4-9f3f-4f73-9ca1-5086f6adf4d5-original-pull-secret\") pod \"global-pull-secret-syncer-26hlr\" (UID: \"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5\") " pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.861025 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.860988 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26hlr" Apr 23 17:54:06.989728 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:06.989700 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-26hlr"] Apr 23 17:54:06.992410 ip-10-0-131-107 kubenswrapper[2579]: W0423 17:54:06.992381 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda457eef4_9f3f_4f73_9ca1_5086f6adf4d5.slice/crio-10973584b7d80095bdb5949a735af240a102be834da6aa2cbab111fa3cc7e537 WatchSource:0}: Error finding container 10973584b7d80095bdb5949a735af240a102be834da6aa2cbab111fa3cc7e537: Status 404 returned error can't find the container with id 10973584b7d80095bdb5949a735af240a102be834da6aa2cbab111fa3cc7e537 Apr 23 17:54:07.217392 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:07.217306 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-26hlr" event={"ID":"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5","Type":"ContainerStarted","Data":"10973584b7d80095bdb5949a735af240a102be834da6aa2cbab111fa3cc7e537"} Apr 23 17:54:11.232144 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:11.232106 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-26hlr" event={"ID":"a457eef4-9f3f-4f73-9ca1-5086f6adf4d5","Type":"ContainerStarted","Data":"b313f80df342a3f3f1f6876ac0b06b7adc26c73b72ccaaa48929db1e6d215ff5"} Apr 23 17:54:11.252166 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:54:11.252119 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-26hlr" podStartSLOduration=1.594229754 podStartE2EDuration="5.252104747s" podCreationTimestamp="2026-04-23 17:54:06 +0000 UTC" firstStartedPulling="2026-04-23 17:54:06.994075497 +0000 UTC m=+778.693676347" lastFinishedPulling="2026-04-23 17:54:10.651950487 +0000 UTC m=+782.351551340" observedRunningTime="2026-04-23 17:54:11.251294891 +0000 UTC m=+782.950895853" watchObservedRunningTime="2026-04-23 17:54:11.252104747 +0000 UTC m=+782.951706026" Apr 23 17:56:08.830837 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:56:08.830798 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 17:56:08.832827 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:56:08.832798 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 17:56:08.833814 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:56:08.833797 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 17:56:08.835557 ip-10-0-131-107 kubenswrapper[2579]: I0423 17:56:08.835538 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:01:08.854688 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:01:08.854660 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:01:08.857199 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:01:08.857077 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:01:08.857707 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:01:08.857692 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:01:08.860144 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:01:08.860122 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:02:20.423526 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.423486 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75c8bc9448-wmpdj"] Apr 23 18:02:20.427284 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.427250 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.446453 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.446418 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75c8bc9448-wmpdj"] Apr 23 18:02:20.504547 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-trusted-ca-bundle\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.504709 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckv2m\" (UniqueName: \"kubernetes.io/projected/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-kube-api-access-ckv2m\") pod \"console-75c8bc9448-wmpdj\" (UID: 
\"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.504709 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-oauth-config\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.504821 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-config\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.504821 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504789 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-service-ca\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.504907 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504825 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-oauth-serving-cert\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.504907 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.504847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-serving-cert\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.605701 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.605640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-service-ca\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.605701 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.605701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-oauth-serving-cert\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606016 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.605821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-serving-cert\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606016 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.605870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-trusted-ca-bundle\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606016 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.605919 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckv2m\" (UniqueName: \"kubernetes.io/projected/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-kube-api-access-ckv2m\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606176 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.606020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-oauth-config\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606176 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.606082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-config\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606463 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.606441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-oauth-serving-cert\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606545 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.606476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-service-ca\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606730 ip-10-0-131-107 
kubenswrapper[2579]: I0423 18:02:20.606707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-trusted-ca-bundle\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.606770 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.606736 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-config\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.608498 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.608477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-serving-cert\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.608595 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.608495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-console-oauth-config\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.616688 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.616658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckv2m\" (UniqueName: \"kubernetes.io/projected/c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c-kube-api-access-ckv2m\") pod \"console-75c8bc9448-wmpdj\" (UID: \"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c\") " pod="openshift-console/console-75c8bc9448-wmpdj" 
Apr 23 18:02:20.739038 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.738914 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:20.873424 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.873399 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75c8bc9448-wmpdj"] Apr 23 18:02:20.875517 ip-10-0-131-107 kubenswrapper[2579]: W0423 18:02:20.875488 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5de42f7_41a1_4f21_9d2d_a7d6eaf8013c.slice/crio-e4efca3a63baee7273e683eb5db38142354858e94e9e80b314c4eed012090dfb WatchSource:0}: Error finding container e4efca3a63baee7273e683eb5db38142354858e94e9e80b314c4eed012090dfb: Status 404 returned error can't find the container with id e4efca3a63baee7273e683eb5db38142354858e94e9e80b314c4eed012090dfb Apr 23 18:02:20.877227 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:20.877211 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:02:21.761410 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:21.761373 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75c8bc9448-wmpdj" event={"ID":"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c","Type":"ContainerStarted","Data":"b09ec15eb690c2c8555b64e88aea1450b105b3d87c407a39637d50753c28f262"} Apr 23 18:02:21.761410 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:21.761412 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75c8bc9448-wmpdj" event={"ID":"c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c","Type":"ContainerStarted","Data":"e4efca3a63baee7273e683eb5db38142354858e94e9e80b314c4eed012090dfb"} Apr 23 18:02:21.781133 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:21.781075 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-75c8bc9448-wmpdj" podStartSLOduration=1.7810598579999999 podStartE2EDuration="1.781059858s" podCreationTimestamp="2026-04-23 18:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:02:21.779777939 +0000 UTC m=+1273.479378834" watchObservedRunningTime="2026-04-23 18:02:21.781059858 +0000 UTC m=+1273.480660731" Apr 23 18:02:30.740083 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:30.739985 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:30.740083 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:30.740043 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:30.744993 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:30.744965 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:30.793910 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:30.793882 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75c8bc9448-wmpdj" Apr 23 18:02:30.843858 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:30.843825 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fcd5b6589-76pgw"] Apr 23 18:02:55.864526 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:55.864462 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fcd5b6589-76pgw" podUID="0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" containerName="console" containerID="cri-o://7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf" gracePeriod=15 Apr 23 18:02:56.106157 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.106131 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5fcd5b6589-76pgw_0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac/console/0.log" Apr 23 18:02:56.106302 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.106202 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fcd5b6589-76pgw" Apr 23 18:02:56.122433 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122353 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-oauth-serving-cert\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122433 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122394 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n2kz\" (UniqueName: \"kubernetes.io/projected/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-kube-api-access-4n2kz\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122636 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122530 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-serving-cert\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122636 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122584 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-trusted-ca-bundle\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122636 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122607 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-service-ca\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122636 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122629 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-config\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122846 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122666 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-oauth-config\") pod \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\" (UID: \"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac\") " Apr 23 18:02:56.122846 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122731 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:56.123012 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.122989 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:56.123012 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.123004 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-oauth-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.123305 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.123050 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-service-ca" (OuterVolumeSpecName: "service-ca") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:56.123359 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.123297 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-config" (OuterVolumeSpecName: "console-config") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:02:56.126728 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.126673 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:02:56.128285 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.128252 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-kube-api-access-4n2kz" (OuterVolumeSpecName: "kube-api-access-4n2kz") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "kube-api-access-4n2kz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:02:56.128496 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.128462 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" (UID: "0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:02:56.224212 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.224171 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4n2kz\" (UniqueName: \"kubernetes.io/projected/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-kube-api-access-4n2kz\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.224212 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.224206 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-serving-cert\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.224212 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.224216 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-trusted-ca-bundle\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.224450 
ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.224226 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-service-ca\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.224450 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.224237 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.224450 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.224245 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac-console-oauth-config\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:02:56.875853 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.875822 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fcd5b6589-76pgw_0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac/console/0.log" Apr 23 18:02:56.876349 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.875865 2579 generic.go:358] "Generic (PLEG): container finished" podID="0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" containerID="7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf" exitCode=2 Apr 23 18:02:56.876349 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.875923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fcd5b6589-76pgw" event={"ID":"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac","Type":"ContainerDied","Data":"7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf"} Apr 23 18:02:56.876349 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.875968 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fcd5b6589-76pgw" Apr 23 18:02:56.876349 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.875999 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fcd5b6589-76pgw" event={"ID":"0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac","Type":"ContainerDied","Data":"b7bd90f64e4f8d5290fdab3e9c1c05a1f7dd9ea9721e60278131f39169c58a21"} Apr 23 18:02:56.876349 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.876021 2579 scope.go:117] "RemoveContainer" containerID="7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf" Apr 23 18:02:56.884626 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.884600 2579 scope.go:117] "RemoveContainer" containerID="7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf" Apr 23 18:02:56.884891 ip-10-0-131-107 kubenswrapper[2579]: E0423 18:02:56.884872 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf\": container with ID starting with 7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf not found: ID does not exist" containerID="7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf" Apr 23 18:02:56.884966 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.884901 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf"} err="failed to get container status \"7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf\": rpc error: code = NotFound desc = could not find container \"7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf\": container with ID starting with 7991cd39b6d4af87adb6f9816908409deafea7e56bf18326701c7cb736483daf not found: ID does not exist" Apr 23 18:02:56.896873 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.896849 2579 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fcd5b6589-76pgw"] Apr 23 18:02:56.900494 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.900471 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fcd5b6589-76pgw"] Apr 23 18:02:56.906435 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:02:56.906409 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" path="/var/lib/kubelet/pods/0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac/volumes" Apr 23 18:05:50.088464 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.088426 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-dnxht"] Apr 23 18:05:50.088881 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.088803 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" containerName="console" Apr 23 18:05:50.088881 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.088814 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" containerName="console" Apr 23 18:05:50.088973 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.088888 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0069f6e0-8acc-4ff5-b509-78ba3d1cb0ac" containerName="console" Apr 23 18:05:50.091919 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.091904 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-dnxht" Apr 23 18:05:50.094349 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.094314 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 18:05:50.094579 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.094559 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 18:05:50.094579 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.094571 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-zkdjc\"" Apr 23 18:05:50.094989 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.094969 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 18:05:50.096251 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.096231 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-dnxht"] Apr 23 18:05:50.206796 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.206765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nt6l\" (UniqueName: \"kubernetes.io/projected/44795495-8e09-4603-b843-e3a68e9c9f16-kube-api-access-6nt6l\") pod \"s3-init-dnxht\" (UID: \"44795495-8e09-4603-b843-e3a68e9c9f16\") " pod="kserve/s3-init-dnxht" Apr 23 18:05:50.307899 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.307852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nt6l\" (UniqueName: \"kubernetes.io/projected/44795495-8e09-4603-b843-e3a68e9c9f16-kube-api-access-6nt6l\") pod \"s3-init-dnxht\" (UID: \"44795495-8e09-4603-b843-e3a68e9c9f16\") " pod="kserve/s3-init-dnxht" Apr 23 18:05:50.318368 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.318338 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nt6l\" 
(UniqueName: \"kubernetes.io/projected/44795495-8e09-4603-b843-e3a68e9c9f16-kube-api-access-6nt6l\") pod \"s3-init-dnxht\" (UID: \"44795495-8e09-4603-b843-e3a68e9c9f16\") " pod="kserve/s3-init-dnxht" Apr 23 18:05:50.421110 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.421023 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dnxht" Apr 23 18:05:50.543568 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:50.543541 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-dnxht"] Apr 23 18:05:50.545929 ip-10-0-131-107 kubenswrapper[2579]: W0423 18:05:50.545901 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44795495_8e09_4603_b843_e3a68e9c9f16.slice/crio-8d77eda23692f5ccff3e3c9a2fbc71b9b4ef6db4daf99e395690f68b3e2569a7 WatchSource:0}: Error finding container 8d77eda23692f5ccff3e3c9a2fbc71b9b4ef6db4daf99e395690f68b3e2569a7: Status 404 returned error can't find the container with id 8d77eda23692f5ccff3e3c9a2fbc71b9b4ef6db4daf99e395690f68b3e2569a7 Apr 23 18:05:51.423031 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:51.422978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dnxht" event={"ID":"44795495-8e09-4603-b843-e3a68e9c9f16","Type":"ContainerStarted","Data":"8d77eda23692f5ccff3e3c9a2fbc71b9b4ef6db4daf99e395690f68b3e2569a7"} Apr 23 18:05:55.437003 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:55.436960 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dnxht" event={"ID":"44795495-8e09-4603-b843-e3a68e9c9f16","Type":"ContainerStarted","Data":"4c4ec5141f35462ddda52aeff4bbbb1aeb4e75c994a29d5eb51c8f73e08f3f4b"} Apr 23 18:05:55.453305 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:55.453254 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-dnxht" podStartSLOduration=1.042071463 podStartE2EDuration="5.453237936s" 
podCreationTimestamp="2026-04-23 18:05:50 +0000 UTC" firstStartedPulling="2026-04-23 18:05:50.547734017 +0000 UTC m=+1482.247334867" lastFinishedPulling="2026-04-23 18:05:54.95890049 +0000 UTC m=+1486.658501340" observedRunningTime="2026-04-23 18:05:55.451596577 +0000 UTC m=+1487.151197451" watchObservedRunningTime="2026-04-23 18:05:55.453237936 +0000 UTC m=+1487.152838808" Apr 23 18:05:58.448666 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:58.448573 2579 generic.go:358] "Generic (PLEG): container finished" podID="44795495-8e09-4603-b843-e3a68e9c9f16" containerID="4c4ec5141f35462ddda52aeff4bbbb1aeb4e75c994a29d5eb51c8f73e08f3f4b" exitCode=0 Apr 23 18:05:58.448666 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:58.448622 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dnxht" event={"ID":"44795495-8e09-4603-b843-e3a68e9c9f16","Type":"ContainerDied","Data":"4c4ec5141f35462ddda52aeff4bbbb1aeb4e75c994a29d5eb51c8f73e08f3f4b"} Apr 23 18:05:59.581280 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:59.581253 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dnxht" Apr 23 18:05:59.698225 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:59.698187 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nt6l\" (UniqueName: \"kubernetes.io/projected/44795495-8e09-4603-b843-e3a68e9c9f16-kube-api-access-6nt6l\") pod \"44795495-8e09-4603-b843-e3a68e9c9f16\" (UID: \"44795495-8e09-4603-b843-e3a68e9c9f16\") " Apr 23 18:05:59.700456 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:59.700425 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44795495-8e09-4603-b843-e3a68e9c9f16-kube-api-access-6nt6l" (OuterVolumeSpecName: "kube-api-access-6nt6l") pod "44795495-8e09-4603-b843-e3a68e9c9f16" (UID: "44795495-8e09-4603-b843-e3a68e9c9f16"). InnerVolumeSpecName "kube-api-access-6nt6l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:05:59.799314 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:05:59.799279 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6nt6l\" (UniqueName: \"kubernetes.io/projected/44795495-8e09-4603-b843-e3a68e9c9f16-kube-api-access-6nt6l\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:06:00.455452 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:00.455419 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dnxht" Apr 23 18:06:00.455628 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:00.455451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dnxht" event={"ID":"44795495-8e09-4603-b843-e3a68e9c9f16","Type":"ContainerDied","Data":"8d77eda23692f5ccff3e3c9a2fbc71b9b4ef6db4daf99e395690f68b3e2569a7"} Apr 23 18:06:00.455628 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:00.455482 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d77eda23692f5ccff3e3c9a2fbc71b9b4ef6db4daf99e395690f68b3e2569a7" Apr 23 18:06:08.883672 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:08.883544 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:06:08.889065 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:08.886569 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:06:08.889065 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:08.886793 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:06:08.889705 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:06:08.889685 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:11:08.909834 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:11:08.909733 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:11:08.914976 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:11:08.912490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:11:08.914976 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:11:08.913300 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:11:08.916305 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:11:08.916281 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:16:08.938327 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:16:08.938214 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:16:08.941653 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:16:08.941629 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:16:08.942207 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:16:08.942188 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 
18:16:08.944978 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:16:08.944956 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log" Apr 23 18:20:26.645597 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.645560 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-shvdf/must-gather-545fw"] Apr 23 18:20:26.646062 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.645919 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44795495-8e09-4603-b843-e3a68e9c9f16" containerName="s3-init" Apr 23 18:20:26.646062 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.645930 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="44795495-8e09-4603-b843-e3a68e9c9f16" containerName="s3-init" Apr 23 18:20:26.646062 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.646012 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="44795495-8e09-4603-b843-e3a68e9c9f16" containerName="s3-init" Apr 23 18:20:26.649039 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.649022 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.651471 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.651445 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-shvdf\"/\"kube-root-ca.crt\"" Apr 23 18:20:26.652056 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.652035 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-shvdf\"/\"default-dockercfg-6dmmb\"" Apr 23 18:20:26.652150 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.652035 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-shvdf\"/\"openshift-service-ca.crt\"" Apr 23 18:20:26.655055 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.655031 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-shvdf/must-gather-545fw"] Apr 23 18:20:26.755211 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.755170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7372384-2421-43d2-baa7-e909a45f468a-must-gather-output\") pod \"must-gather-545fw\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.755386 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.755231 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxbt\" (UniqueName: \"kubernetes.io/projected/a7372384-2421-43d2-baa7-e909a45f468a-kube-api-access-kgxbt\") pod \"must-gather-545fw\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.856165 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.856125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/a7372384-2421-43d2-baa7-e909a45f468a-must-gather-output\") pod \"must-gather-545fw\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.856341 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.856183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxbt\" (UniqueName: \"kubernetes.io/projected/a7372384-2421-43d2-baa7-e909a45f468a-kube-api-access-kgxbt\") pod \"must-gather-545fw\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.856470 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.856450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7372384-2421-43d2-baa7-e909a45f468a-must-gather-output\") pod \"must-gather-545fw\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.865379 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.865355 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxbt\" (UniqueName: \"kubernetes.io/projected/a7372384-2421-43d2-baa7-e909a45f468a-kube-api-access-kgxbt\") pod \"must-gather-545fw\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:26.975567 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:26.975474 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:27.107623 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:27.107595 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-shvdf/must-gather-545fw"] Apr 23 18:20:27.110112 ip-10-0-131-107 kubenswrapper[2579]: W0423 18:20:27.110066 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7372384_2421_43d2_baa7_e909a45f468a.slice/crio-9af46d184b8d2fc562f9e706f7404bc46aacc4c552b96fa4436a19e7e05bee6d WatchSource:0}: Error finding container 9af46d184b8d2fc562f9e706f7404bc46aacc4c552b96fa4436a19e7e05bee6d: Status 404 returned error can't find the container with id 9af46d184b8d2fc562f9e706f7404bc46aacc4c552b96fa4436a19e7e05bee6d Apr 23 18:20:27.112090 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:27.112072 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:20:27.159901 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:27.159848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-shvdf/must-gather-545fw" event={"ID":"a7372384-2421-43d2-baa7-e909a45f468a","Type":"ContainerStarted","Data":"9af46d184b8d2fc562f9e706f7404bc46aacc4c552b96fa4436a19e7e05bee6d"} Apr 23 18:20:32.178629 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:32.178594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-shvdf/must-gather-545fw" event={"ID":"a7372384-2421-43d2-baa7-e909a45f468a","Type":"ContainerStarted","Data":"163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6"} Apr 23 18:20:32.178629 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:32.178632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-shvdf/must-gather-545fw" 
event={"ID":"a7372384-2421-43d2-baa7-e909a45f468a","Type":"ContainerStarted","Data":"7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5"} Apr 23 18:20:32.205481 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:32.205430 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-shvdf/must-gather-545fw" podStartSLOduration=1.960613102 podStartE2EDuration="6.205410854s" podCreationTimestamp="2026-04-23 18:20:26 +0000 UTC" firstStartedPulling="2026-04-23 18:20:27.112200776 +0000 UTC m=+2358.811801627" lastFinishedPulling="2026-04-23 18:20:31.356998516 +0000 UTC m=+2363.056599379" observedRunningTime="2026-04-23 18:20:32.203004061 +0000 UTC m=+2363.902604933" watchObservedRunningTime="2026-04-23 18:20:32.205410854 +0000 UTC m=+2363.905011726" Apr 23 18:20:50.240472 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:50.240434 2579 generic.go:358] "Generic (PLEG): container finished" podID="a7372384-2421-43d2-baa7-e909a45f468a" containerID="7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5" exitCode=0 Apr 23 18:20:50.240912 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:50.240494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-shvdf/must-gather-545fw" event={"ID":"a7372384-2421-43d2-baa7-e909a45f468a","Type":"ContainerDied","Data":"7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5"} Apr 23 18:20:50.240912 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:50.240791 2579 scope.go:117] "RemoveContainer" containerID="7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5" Apr 23 18:20:50.424680 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:50.424650 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-shvdf_must-gather-545fw_a7372384-2421-43d2-baa7-e909a45f468a/gather/0.log" Apr 23 18:20:53.727660 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:53.727626 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-26hlr_a457eef4-9f3f-4f73-9ca1-5086f6adf4d5/global-pull-secret-syncer/0.log" Apr 23 18:20:53.979422 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:53.979340 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-d8twf_b2c8d6fa-fde8-484b-bc81-f7412492a7fa/konnectivity-agent/0.log" Apr 23 18:20:54.052957 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:54.052912 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-107.ec2.internal_ddd1d509744edf6dd1d0f6ae52b4d7c3/haproxy/0.log" Apr 23 18:20:55.797387 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:55.797345 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-shvdf/must-gather-545fw"] Apr 23 18:20:55.797994 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:55.797652 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-shvdf/must-gather-545fw" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="copy" containerID="cri-o://163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6" gracePeriod=2 Apr 23 18:20:55.799099 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:55.799076 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-shvdf/must-gather-545fw"] Apr 23 18:20:55.799454 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:55.799423 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7372384-2421-43d2-baa7-e909a45f468a" pod="openshift-must-gather-shvdf/must-gather-545fw" err="pods \"must-gather-545fw\" is forbidden: User \"system:node:ip-10-0-131-107.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-shvdf\": no relationship found between node 'ip-10-0-131-107.ec2.internal' and this object" Apr 23 18:20:56.033098 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.033073 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-shvdf_must-gather-545fw_a7372384-2421-43d2-baa7-e909a45f468a/copy/0.log" Apr 23 18:20:56.033480 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.033464 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:56.035372 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.035350 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7372384-2421-43d2-baa7-e909a45f468a" pod="openshift-must-gather-shvdf/must-gather-545fw" err="pods \"must-gather-545fw\" is forbidden: User \"system:node:ip-10-0-131-107.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-shvdf\": no relationship found between node 'ip-10-0-131-107.ec2.internal' and this object" Apr 23 18:20:56.229642 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.229541 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7372384-2421-43d2-baa7-e909a45f468a-must-gather-output\") pod \"a7372384-2421-43d2-baa7-e909a45f468a\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " Apr 23 18:20:56.229642 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.229587 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxbt\" (UniqueName: \"kubernetes.io/projected/a7372384-2421-43d2-baa7-e909a45f468a-kube-api-access-kgxbt\") pod \"a7372384-2421-43d2-baa7-e909a45f468a\" (UID: \"a7372384-2421-43d2-baa7-e909a45f468a\") " Apr 23 18:20:56.230748 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.230720 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7372384-2421-43d2-baa7-e909a45f468a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a7372384-2421-43d2-baa7-e909a45f468a" (UID: 
"a7372384-2421-43d2-baa7-e909a45f468a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:20:56.231883 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.231862 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7372384-2421-43d2-baa7-e909a45f468a-kube-api-access-kgxbt" (OuterVolumeSpecName: "kube-api-access-kgxbt") pod "a7372384-2421-43d2-baa7-e909a45f468a" (UID: "a7372384-2421-43d2-baa7-e909a45f468a"). InnerVolumeSpecName "kube-api-access-kgxbt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:20:56.260412 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.260380 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-shvdf_must-gather-545fw_a7372384-2421-43d2-baa7-e909a45f468a/copy/0.log" Apr 23 18:20:56.260759 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.260728 2579 generic.go:358] "Generic (PLEG): container finished" podID="a7372384-2421-43d2-baa7-e909a45f468a" containerID="163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6" exitCode=143 Apr 23 18:20:56.260877 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.260776 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-shvdf/must-gather-545fw" Apr 23 18:20:56.260877 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.260838 2579 scope.go:117] "RemoveContainer" containerID="163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6" Apr 23 18:20:56.263812 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.263782 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7372384-2421-43d2-baa7-e909a45f468a" pod="openshift-must-gather-shvdf/must-gather-545fw" err="pods \"must-gather-545fw\" is forbidden: User \"system:node:ip-10-0-131-107.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-shvdf\": no relationship found between node 'ip-10-0-131-107.ec2.internal' and this object" Apr 23 18:20:56.269410 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.269389 2579 scope.go:117] "RemoveContainer" containerID="7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5" Apr 23 18:20:56.271864 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.271841 2579 status_manager.go:895] "Failed to get status for pod" podUID="a7372384-2421-43d2-baa7-e909a45f468a" pod="openshift-must-gather-shvdf/must-gather-545fw" err="pods \"must-gather-545fw\" is forbidden: User \"system:node:ip-10-0-131-107.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-shvdf\": no relationship found between node 'ip-10-0-131-107.ec2.internal' and this object" Apr 23 18:20:56.281613 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.281596 2579 scope.go:117] "RemoveContainer" containerID="163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6" Apr 23 18:20:56.281843 ip-10-0-131-107 kubenswrapper[2579]: E0423 18:20:56.281821 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6\": container with ID 
starting with 163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6 not found: ID does not exist" containerID="163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6" Apr 23 18:20:56.281892 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.281851 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6"} err="failed to get container status \"163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6\": rpc error: code = NotFound desc = could not find container \"163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6\": container with ID starting with 163b857dc654aff2f87659fc6364e4b7fe0e48f5420efa25494e51a63826ddf6 not found: ID does not exist" Apr 23 18:20:56.281892 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.281870 2579 scope.go:117] "RemoveContainer" containerID="7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5" Apr 23 18:20:56.282131 ip-10-0-131-107 kubenswrapper[2579]: E0423 18:20:56.282113 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5\": container with ID starting with 7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5 not found: ID does not exist" containerID="7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5" Apr 23 18:20:56.282176 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.282137 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5"} err="failed to get container status \"7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5\": rpc error: code = NotFound desc = could not find container \"7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5\": container with ID starting with 
7466a8cde95e5223d508c717138fb10558e94b56e7d2a72d45fd717b27af9cf5 not found: ID does not exist" Apr 23 18:20:56.330671 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.330634 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7372384-2421-43d2-baa7-e909a45f468a-must-gather-output\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:20:56.330671 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.330667 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgxbt\" (UniqueName: \"kubernetes.io/projected/a7372384-2421-43d2-baa7-e909a45f468a-kube-api-access-kgxbt\") on node \"ip-10-0-131-107.ec2.internal\" DevicePath \"\"" Apr 23 18:20:56.906449 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:56.906415 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7372384-2421-43d2-baa7-e909a45f468a" path="/var/lib/kubelet/pods/a7372384-2421-43d2-baa7-e909a45f468a/volumes" Apr 23 18:20:57.445748 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.445725 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/alertmanager/0.log" Apr 23 18:20:57.475977 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.475926 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/config-reloader/0.log" Apr 23 18:20:57.508791 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.508765 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/kube-rbac-proxy-web/0.log" Apr 23 18:20:57.546217 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.546185 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/kube-rbac-proxy/0.log" 
Apr 23 18:20:57.571825 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.571799 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/kube-rbac-proxy-metric/0.log" Apr 23 18:20:57.601310 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.601286 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/prom-label-proxy/0.log" Apr 23 18:20:57.627894 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.627869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3808bbb3-777a-4aa1-b625-18d13b2fef7e/init-config-reloader/0.log" Apr 23 18:20:57.668502 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.668470 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8hmmb_a3ef0c5c-c346-4ab2-8f0b-127c98996cb5/cluster-monitoring-operator/0.log" Apr 23 18:20:57.700821 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.700749 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ldjsl_d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68/kube-state-metrics/0.log" Apr 23 18:20:57.729421 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.729375 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ldjsl_d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68/kube-rbac-proxy-main/0.log" Apr 23 18:20:57.758231 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.758199 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ldjsl_d00b29bf-616c-4a9e-9e3a-3c4c49b1dd68/kube-rbac-proxy-self/0.log" Apr 23 18:20:57.788128 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.788101 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-6dc9db7797-6hfqn_fada3561-13e0-4497-94b2-f78573fc03cc/metrics-server/0.log" Apr 23 18:20:57.823649 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.823624 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xjczr_d8b9985d-f43d-498a-89a5-e00d324d21e2/monitoring-plugin/0.log" Apr 23 18:20:57.940853 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.940828 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nnznq_5548baa4-2eb8-441d-a84d-db3b7a2b5e6e/node-exporter/0.log" Apr 23 18:20:57.965606 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.965533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nnznq_5548baa4-2eb8-441d-a84d-db3b7a2b5e6e/kube-rbac-proxy/0.log" Apr 23 18:20:57.989518 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:57.989494 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nnznq_5548baa4-2eb8-441d-a84d-db3b7a2b5e6e/init-textfile/0.log" Apr 23 18:20:58.103641 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.103613 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n6zwj_2ea54b50-dbce-4156-ab82-8df3cfa10a71/kube-rbac-proxy-main/0.log" Apr 23 18:20:58.132782 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.132743 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n6zwj_2ea54b50-dbce-4156-ab82-8df3cfa10a71/kube-rbac-proxy-self/0.log" Apr 23 18:20:58.158511 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.158486 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n6zwj_2ea54b50-dbce-4156-ab82-8df3cfa10a71/openshift-state-metrics/0.log" Apr 23 18:20:58.212975 ip-10-0-131-107 
kubenswrapper[2579]: I0423 18:20:58.212926 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/prometheus/0.log" Apr 23 18:20:58.235248 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.235166 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/config-reloader/0.log" Apr 23 18:20:58.266517 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.266487 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/thanos-sidecar/0.log" Apr 23 18:20:58.295248 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.295216 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/kube-rbac-proxy-web/0.log" Apr 23 18:20:58.324676 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.324636 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/kube-rbac-proxy/0.log" Apr 23 18:20:58.351696 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.351660 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/kube-rbac-proxy-thanos/0.log" Apr 23 18:20:58.379292 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.379265 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_464064e6-cf95-41e9-a8bb-2f29be481bc8/init-config-reloader/0.log" Apr 23 18:20:58.500926 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.500833 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d98bc84f8-p6fv8_bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1/telemeter-client/0.log" Apr 23 18:20:58.529123 ip-10-0-131-107 
kubenswrapper[2579]: I0423 18:20:58.529093 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d98bc84f8-p6fv8_bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1/reload/0.log" Apr 23 18:20:58.555274 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.555247 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d98bc84f8-p6fv8_bb1a8e7a-74a5-4fce-ad08-c6a765d4adc1/kube-rbac-proxy/0.log" Apr 23 18:20:58.590612 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.590563 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6885598b6-lfxsh_3e23db1e-dda5-495a-9cdb-d49b902d0e8f/thanos-query/0.log" Apr 23 18:20:58.643012 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.642981 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6885598b6-lfxsh_3e23db1e-dda5-495a-9cdb-d49b902d0e8f/kube-rbac-proxy-web/0.log" Apr 23 18:20:58.705233 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.705207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6885598b6-lfxsh_3e23db1e-dda5-495a-9cdb-d49b902d0e8f/kube-rbac-proxy/0.log" Apr 23 18:20:58.779175 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.779145 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6885598b6-lfxsh_3e23db1e-dda5-495a-9cdb-d49b902d0e8f/prom-label-proxy/0.log" Apr 23 18:20:58.818039 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.818013 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6885598b6-lfxsh_3e23db1e-dda5-495a-9cdb-d49b902d0e8f/kube-rbac-proxy-rules/0.log" Apr 23 18:20:58.867020 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:20:58.866989 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-6885598b6-lfxsh_3e23db1e-dda5-495a-9cdb-d49b902d0e8f/kube-rbac-proxy-metrics/0.log" Apr 23 18:21:00.470544 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:00.470456 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log" Apr 23 18:21:00.475531 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:00.475508 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/3.log" Apr 23 18:21:00.863025 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:00.862994 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75c8bc9448-wmpdj_c5de42f7-41a1-4f21-9d2d-a7d6eaf8013c/console/0.log" Apr 23 18:21:01.074020 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.073982 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj"] Apr 23 18:21:01.074377 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.074364 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="gather" Apr 23 18:21:01.074420 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.074381 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="gather" Apr 23 18:21:01.074420 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.074401 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="copy" Apr 23 18:21:01.074420 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.074408 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="copy" Apr 23 18:21:01.074510 
ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.074487 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="copy" Apr 23 18:21:01.074510 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.074498 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7372384-2421-43d2-baa7-e909a45f468a" containerName="gather" Apr 23 18:21:01.077006 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.076984 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.079075 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.079052 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gtcqp\"/\"default-dockercfg-qkf26\"" Apr 23 18:21:01.079198 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.079052 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gtcqp\"/\"openshift-service-ca.crt\"" Apr 23 18:21:01.079556 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.079541 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gtcqp\"/\"kube-root-ca.crt\"" Apr 23 18:21:01.087169 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.087146 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj"] Apr 23 18:21:01.173722 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.173627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-podres\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.173722 ip-10-0-131-107 kubenswrapper[2579]: I0423 
18:21:01.173673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-proc\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.173722 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.173701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-lib-modules\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.173973 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.173738 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-sys\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.173973 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.173807 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gssg8\" (UniqueName: \"kubernetes.io/projected/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-kube-api-access-gssg8\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274612 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274571 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-podres\") pod 
\"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274612 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-proc\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274819 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-proc\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274819 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-lib-modules\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274819 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-sys\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274819 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-podres\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.274819 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274806 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-sys\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.275066 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-lib-modules\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.275066 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.274854 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gssg8\" (UniqueName: \"kubernetes.io/projected/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-kube-api-access-gssg8\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.283278 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.283245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gssg8\" (UniqueName: \"kubernetes.io/projected/5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f-kube-api-access-gssg8\") pod \"perf-node-gather-daemonset-xcfcj\" (UID: \"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f\") " pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.387768 ip-10-0-131-107 
kubenswrapper[2579]: I0423 18:21:01.387736 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:01.510459 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:01.510432 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj"] Apr 23 18:21:01.512867 ip-10-0-131-107 kubenswrapper[2579]: W0423 18:21:01.512841 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5ef9fc2e_21f8_4d2b_8a60_3d6382a69b2f.slice/crio-f9f4c3250970a0cbd1e0b9ab06e005289a06979f22241911bad9f0a3ea64e728 WatchSource:0}: Error finding container f9f4c3250970a0cbd1e0b9ab06e005289a06979f22241911bad9f0a3ea64e728: Status 404 returned error can't find the container with id f9f4c3250970a0cbd1e0b9ab06e005289a06979f22241911bad9f0a3ea64e728 Apr 23 18:21:02.084909 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.084878 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7trf4_d9288ab5-ccb4-416b-aa52-180278252652/dns/0.log" Apr 23 18:21:02.110316 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.110291 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7trf4_d9288ab5-ccb4-416b-aa52-180278252652/kube-rbac-proxy/0.log" Apr 23 18:21:02.284438 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.284397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" event={"ID":"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f","Type":"ContainerStarted","Data":"6c12200ba18d041cade515fc15908bff72d7c86bd60724bf1c1ceee308ef3e1e"} Apr 23 18:21:02.284438 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.284436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" 
event={"ID":"5ef9fc2e-21f8-4d2b-8a60-3d6382a69b2f","Type":"ContainerStarted","Data":"f9f4c3250970a0cbd1e0b9ab06e005289a06979f22241911bad9f0a3ea64e728"} Apr 23 18:21:02.284650 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.284534 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" Apr 23 18:21:02.303848 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.303791 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj" podStartSLOduration=1.303774951 podStartE2EDuration="1.303774951s" podCreationTimestamp="2026-04-23 18:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:21:02.30127409 +0000 UTC m=+2394.000874974" watchObservedRunningTime="2026-04-23 18:21:02.303774951 +0000 UTC m=+2394.003375848" Apr 23 18:21:02.305127 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.305110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-thrhm_994fa59f-956e-4d6b-8074-e3d2459771d9/dns-node-resolver/0.log" Apr 23 18:21:02.790019 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:02.789993 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jldlx_672ee16e-ccf3-47b3-a727-a91e0e7a9fbc/node-ca/0.log" Apr 23 18:21:04.002765 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:04.002736 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-g4zr9_abe92563-1ac2-4f25-b349-ffe1fce0022f/serve-healthcheck-canary/0.log" Apr 23 18:21:04.408187 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:04.408156 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dsmq2_fc12dc89-ceae-445a-92dc-0ec601992482/insights-operator/0.log" Apr 23 
18:21:04.408836 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:04.408819 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dsmq2_fc12dc89-ceae-445a-92dc-0ec601992482/insights-operator/1.log"
Apr 23 18:21:04.507549 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:04.507523 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89vtf_f28b3e97-3375-41b6-8e26-7f03bbe44a6c/kube-rbac-proxy/0.log"
Apr 23 18:21:04.533182 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:04.533152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89vtf_f28b3e97-3375-41b6-8e26-7f03bbe44a6c/exporter/0.log"
Apr 23 18:21:04.559584 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:04.559548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89vtf_f28b3e97-3375-41b6-8e26-7f03bbe44a6c/extractor/0.log"
Apr 23 18:21:06.820872 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:06.820842 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-dnxht_44795495-8e09-4603-b843-e3a68e9c9f16/s3-init/0.log"
Apr 23 18:21:08.298450 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:08.298423 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gtcqp/perf-node-gather-daemonset-xcfcj"
Apr 23 18:21:08.964383 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:08.964266 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log"
Apr 23 18:21:08.979951 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:08.970752 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 18:21:08.979951 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:08.970781 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-d79k5_0e6eb10c-aea8-4862-8cc7-ecd18b5f6498/console-operator/2.log"
Apr 23 18:21:08.979951 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:08.974321 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 18:21:11.153961 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:11.153913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4h56m_1cdd9b61-36a1-469f-97dc-02d5fadd2316/migrator/0.log"
Apr 23 18:21:11.181691 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:11.181643 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4h56m_1cdd9b61-36a1-469f-97dc-02d5fadd2316/graceful-termination/0.log"
Apr 23 18:21:12.707605 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.707578 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/kube-multus-additional-cni-plugins/0.log"
Apr 23 18:21:12.736696 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.736671 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/egress-router-binary-copy/0.log"
Apr 23 18:21:12.769444 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.769415 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/cni-plugins/0.log"
Apr 23 18:21:12.794559 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.794533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/bond-cni-plugin/0.log"
Apr 23 18:21:12.819283 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.819256 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/routeoverride-cni/0.log"
Apr 23 18:21:12.845870 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.845842 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/whereabouts-cni-bincopy/0.log"
Apr 23 18:21:12.869815 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:12.869789 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qnskt_5b02fbde-34d7-498c-ba52-33a9307442e3/whereabouts-cni/0.log"
Apr 23 18:21:13.108645 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:13.108610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w5pwh_2a3e43e9-302e-40d9-ac72-286f11253b7c/kube-multus/0.log"
Apr 23 18:21:13.137059 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:13.137025 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-45qq7_51d0c740-3b6f-4927-90d0-03577afcf352/network-metrics-daemon/0.log"
Apr 23 18:21:13.160538 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:13.160512 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-45qq7_51d0c740-3b6f-4927-90d0-03577afcf352/kube-rbac-proxy/0.log"
Apr 23 18:21:14.646337 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.646308 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-controller/0.log"
Apr 23 18:21:14.671238 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.671213 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/0.log"
Apr 23 18:21:14.682455 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.682434 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovn-acl-logging/1.log"
Apr 23 18:21:14.706122 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.706100 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/kube-rbac-proxy-node/0.log"
Apr 23 18:21:14.734862 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.734834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 18:21:14.762725 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.762699 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/northd/0.log"
Apr 23 18:21:14.794171 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.794143 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/nbdb/0.log"
Apr 23 18:21:14.885295 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:14.885257 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/sbdb/0.log"
Apr 23 18:21:15.069431 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:15.069400 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zxwk2_b82ed798-7ebc-40ee-8b35-03a742ad4e5e/ovnkube-controller/0.log"
Apr 23 18:21:16.182515 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:16.182478 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-v5kgg_45adc12a-53d2-46dc-a15b-d354387909c2/check-endpoints/0.log"
Apr 23 18:21:16.238147 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:16.238112 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-h7vln_7f6fcfee-1046-4e89-a5c4-e3d550d4056f/network-check-target-container/0.log"
Apr 23 18:21:17.349343 ip-10-0-131-107 kubenswrapper[2579]: I0423 18:21:17.349312 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hnppj_139eafee-efc6-482a-9490-90498d03dec5/iptables-alerter/0.log"